Install Varnish Cache with Nginx on Rocky Linux 9: Best Reverse Proxy
In this guide, we’ll walk you through installing Varnish Cache with Nginx on Rocky Linux 9. Varnish is a powerful reverse proxy that caches web content in memory and serves it quickly to subsequent visitors. It does this by storing copies of web pages keyed on request attributes such as the URL and request method, a technique known as full-page caching.
Varnish Cache is a game-changer for websites, especially e-commerce platforms. It drastically reduces page load times, leading to a faster and more responsive user experience, which is crucial for retaining visitors and boosting conversions.
Follow the steps below to set up Varnish Cache with Nginx on your Rocky Linux 9 server.
Prerequisites
Before you Install Varnish Cache with Nginx on Rocky Linux 9, ensure you have the following:
- A Rocky Linux 9 Server: You should be logged in as a non-root user with sudo privileges. Refer to this guide for initial server setup: Initial Server Setup with Rocky Linux 9.
- Nginx Web Server Installed: Nginx must be installed and configured on your server. Use this guide for Nginx installation: How To Install Nginx on Rocky Linux 9.
Once you’ve met these requirements, proceed with the following steps:
1. Install Varnish Cache on Rocky Linux 9
First, update your system’s package index:
sudo dnf -y update
Disable any pre-existing Varnish modules:
sudo dnf -y module disable varnish
Install the EPEL (Extra Packages for Enterprise Linux) repository:
. /etc/os-release
sudo dnf install https://dl.fedoraproject.org/pub/epel/epel-release-latest-${VERSION_ID%%.*}.noarch.rpm
Download Varnish
Visit the Varnish Downloads page to find the latest release, then add the Varnish Cache repository to your system with curl:
curl -s https://packagecloud.io/install/repositories/varnishcache/varnish72/script.rpm.sh | sudo bash
Upon successful download, you’ll see the following output:
**Output**
The repository is setup! You can now install packages.
Install the Varnish package:
sudo dnf install varnish -y
Manage Varnish Cache Service
Start and enable the Varnish service to ensure it starts automatically on boot:
sudo systemctl start varnish
sudo systemctl enable varnish
Verify the Varnish Cache service is active and running:
sudo systemctl status varnish
[Image of Varnish cache service status]
2. Configure Varnish on Rocky Linux 9
By default, Varnish listens on port 6081. Since Varnish will act as the front-end proxy, intercepting all HTTP requests, you need to change its listening port to 80.
Open the varnish.service unit file with your preferred text editor (e.g., vi):
sudo vi /usr/lib/systemd/system/varnish.service
Locate the ExecStart line and change the -a port to 80:
...
ExecStart=/usr/sbin/varnishd -a :**80**
...
Save and close the file.
Reload the systemd daemon so the unit change is picked up:
sudo systemctl daemon-reload
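Note that editing the packaged unit file directly can be undone by a future package update. A more durable alternative is a systemd drop-in override; a sketch, assuming the stock ExecStart options shipped with the Varnish package (keep your own -f and -s values if they differ):

```ini
# /etc/systemd/system/varnish.service.d/override.conf
# (create interactively with: sudo systemctl edit varnish)
[Service]
# Clear the packaged ExecStart, then redefine it with -a :80.
ExecStart=
ExecStart=/usr/sbin/varnishd -a :80 -f /etc/varnish/default.vcl -s malloc,256m
```

A drop-in survives package upgrades; run sudo systemctl daemon-reload afterwards, just as with a direct edit.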
3. Configure Nginx to Work with Varnish Cache
Next, configure Nginx to listen on port 8080 instead of 80 to avoid conflicts with Varnish. Open the Nginx configuration file:
sudo vi /etc/nginx/nginx.conf
Change the listening port to 8080:
server {
listen **8080** default_server;
listen [::]:**8080** default_server;
.....
Save and close the file.
Restart Nginx, then restart Varnish so it can bind to the now-free port 80:
sudo systemctl restart nginx
sudo systemctl restart varnish
4. Configure Firewall For Varnish Cache
Allow HTTP traffic to Varnish on port 80, and open port 8080 if you need to reach the Nginx backend directly:
sudo firewall-cmd --zone=public --add-service=http --permanent
sudo firewall-cmd --zone=public --add-port=8080/tcp --permanent
sudo firewall-cmd --reload
Ensure Nginx is configured as the backend server for Varnish. Edit the Varnish configuration file:
sudo vi /etc/varnish/default.vcl
The backend configuration should look like this:
backend default {
    .host = "127.0.0.1";
    .port = "8080";
}
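The default.vcl file can also carry caching policy. As one example (a sketch, not part of the base setup; adjust to your application), a vcl_recv routine that bypasses the cache for requests carrying cookies, which usually indicate logged-in or personalized sessions:

```vcl
sub vcl_recv {
    # Requests with cookies are usually personalized; pass them
    # straight through to the Nginx backend instead of caching.
    if (req.http.Cookie) {
        return (pass);
    }
}
```

After editing the VCL, reload Varnish with sudo systemctl reload varnish to apply it.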
5. Testing Varnish Cache Nginx
Finally, verify that Varnish is caching content correctly using curl:
curl -I http://localhost
[Image of Testing Varnish Cache Nginx 1]
This displays the HTTP response headers. Run the command again and you’ll notice the Age header with a value greater than zero, indicating that the response was served from the Varnish cache:
curl -I http://localhost
[Image of Testing Varnish Cache Nginx 2]
The same test works against your domain name, provided its DNS A record points at this server.
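If you want to script this check, the hit/miss decision reduces to inspecting the Age header. A small helper sketch (the sample header block below is illustrative, not captured from a live server; in practice pipe in `curl -sI http://localhost`):

```shell
# Succeeds if the HTTP headers on stdin contain an Age header > 0,
# which indicates the response was served from the Varnish cache.
is_cache_hit() {
  awk 'tolower($1) == "age:" { if ($2 + 0 > 0) hit = 1 } END { exit !hit }'
}

# Example with a fabricated response:
printf 'HTTP/1.1 200 OK\r\nAge: 42\r\nVia: 1.1 varnish\r\n' | is_cache_hit && echo "HIT"
# prints "HIT"
```

A fresh (uncached) response typically carries Age: 0 or no Age header at all, so the helper reports a miss for it.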
Conclusion
Installing Varnish Cache with Nginx on Rocky Linux 9 drastically improves your web server’s performance by caching and serving content efficiently. You have now successfully configured Varnish Cache with Nginx on Rocky Linux 9.
You might also find these articles helpful:
- Install LEMP Stack on Rocky Linux 9
- Install Caddy Web Server on Rocky Linux 9
- Installing GitLab on Rocky Linux 9
- Zoom Meeting App Rocky Linux 9
- Run Apache Solr Rocky Linux 9
- PHP ionCube Loader Rocky Linux 9
- Install Apache Maven Rocky Linux 9
Alternative Solutions for Caching with Nginx
While Varnish is an excellent choice, other solutions can achieve similar caching results. Here are two alternatives for caching with Nginx on Rocky Linux 9.
1. Nginx’s Built-in Caching
Nginx itself has built-in caching capabilities that can be used to cache static and dynamic content. This approach eliminates the need for a separate caching proxy like Varnish, simplifying the architecture.
Explanation:
Nginx’s proxy cache stores responses from backend servers and serves them directly to clients. This is configured within the nginx.conf
file. The cache is stored on disk, and Nginx manages its size and expiration. While not as performant as Varnish’s in-memory caching, it’s a good option for simpler setups or when memory resources are limited.
Configuration:
- Define a Cache Zone: In the http block of your nginx.conf file, define a cache zone:

  http {
      proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=my_cache:10m max_size=10g inactive=60m use_temp_path=off;
      proxy_cache_key "$scheme$request_method$host$request_uri";

      server {
          # ... your server configuration ...
      }
  }

  proxy_cache_path: Specifies the directory to store the cache, the cache hierarchy levels, a shared memory zone name (my_cache) for storing cache keys and metadata, the maximum cache size, and how long inactive cache items should remain. use_temp_path=off disables temporary file copying during cache writes.
  proxy_cache_key: Defines the key used to identify cache entries.
- Enable Caching for Specific Locations: In the server block, enable caching by adding the following to the location block:

  location / {
      proxy_pass http://backend;                                   # Replace with your backend server address
      proxy_cache my_cache;
      proxy_cache_valid 200 302 60m;                               # Cache successful responses (200, 302) for 60 minutes
      proxy_cache_valid 404 1m;                                    # Cache 404 errors for 1 minute
      proxy_cache_use_stale error timeout invalid_header updating; # Serve stale content on errors or timeouts
      proxy_cache_background_update on;                            # Update stale cache entries in the background
      proxy_cache_lock on;                                         # Collapse concurrent requests for uncached content
      add_header X-Cache-Status $upstream_cache_status;            # Show the cache status (HIT, MISS, BYPASS)
  }

  proxy_pass: Specifies the address of the backend server.
  proxy_cache: Enables the cache zone defined earlier.
  proxy_cache_valid: Sets the caching duration for different HTTP response codes.
  proxy_cache_use_stale: Defines when stale content can be served.
  proxy_cache_background_update: Allows stale cache entries to be updated in the background.
  proxy_cache_lock: Prevents multiple simultaneous requests to the backend server when the content is not cached.
  add_header: Adds an X-Cache-Status header to the response, indicating whether the content was served from the cache.
- Backend Server Configuration: Configure your backend server to return proper Cache-Control headers. This allows Nginx to cache content accurately and respect expiration times.
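For instance, the backend (here, the same Nginx instance’s 8080 server block) might mark static assets as cacheable; a sketch with an assumed one-hour lifetime:

```nginx
# Inside the backend server block: allow static assets to be cached for 1 hour.
location ~* \.(css|js|png|jpe?g|gif|svg|woff2?)$ {
    add_header Cache-Control "public, max-age=3600";
}
```

Dynamic responses should instead send Cache-Control: no-cache (or a short max-age) so proxies don’t serve stale pages.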
Advantages:
- Simpler configuration.
- No need for additional software.
- Reduced resource consumption compared to running a separate caching proxy.
Disadvantages:
- Disk-based caching is slower than Varnish’s in-memory caching.
- Less flexibility in caching policies compared to Varnish.
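If disk speed is the bottleneck, one common mitigation (a sketch under assumptions, not part of the setup above; the size is hypothetical and must be tuned to available RAM) is to back the cache directory with tmpfs so the cache effectively lives in memory:

```
# /etc/fstab entry; size=512m is an assumed value, tune to your memory budget.
tmpfs  /var/cache/nginx  tmpfs  rw,size=512m,mode=0700  0  0
```

This narrows the performance gap with Varnish’s in-memory store, at the cost of losing the cache on reboot.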
2. Cloudflare (or similar CDN)
Using a Content Delivery Network (CDN) like Cloudflare is another effective way to cache content and improve website performance. CDNs distribute your website’s content across multiple servers located around the world, ensuring that users can access your content quickly from a server close to their location.
Explanation:
Cloudflare acts as a reverse proxy and caches static content (images, CSS, JavaScript) on its global network. It also provides other benefits, such as DDoS protection and SSL/TLS encryption. For dynamic content, Cloudflare can be configured to cache specific pages based on rules you define.
Configuration:
- Sign up for Cloudflare: Create an account on the Cloudflare website and add your domain.
- Update DNS Records: Follow Cloudflare’s instructions to update your domain’s DNS records to point to Cloudflare’s name servers.
- Configure Caching Rules: In the Cloudflare dashboard, configure caching rules to specify which content should be cached and for how long. You can use Page Rules to customize caching behavior based on URL patterns. For example:
  - To cache all static assets aggressively: *yourdomain.com/wp-content/uploads/* with a cache level of "Cache Everything" and a browser cache TTL of "1 month".
  - To bypass the cache for the WordPress admin area: *yourdomain.com/wp-admin* with a cache level of "Bypass Cache".
- Enable Browser Cache TTL: Set the Browser Cache TTL (Time To Live) to instruct browsers how long to cache static assets.
- Consider Using APO (Automatic Platform Optimization): For WordPress sites, Cloudflare’s APO feature can significantly improve performance by caching dynamic content on their edge network.
Advantages:
- Global content delivery network for faster loading times worldwide.
- DDoS protection and security features.
- Easy to set up and manage.
Disadvantages:
- Reliance on a third-party service.
- Cost can vary depending on the plan and features used.
- Limited control over caching policies compared to self-hosted solutions like Varnish or Nginx’s built-in caching.
By exploring these alternative solutions, you can choose the caching strategy that best fits your specific needs and technical expertise.