Install Varnish Cache with Nginx on Ubuntu 22.04 with Easy Steps
This tutorial aims to guide you through the process of installing Varnish Cache with Nginx on Ubuntu 22.04. Varnish Cache, often referred to as a caching HTTP reverse proxy, acts as a web application accelerator. Think of it as a middleman that sits between your clients (users) and your web server: instead of your web server handling every request for content directly, Varnish takes on this responsibility. Follow the steps below to complete the Varnish Cache server setup with the Nginx web server on Ubuntu 22.04. Installing Varnish Cache with Nginx can significantly improve your website's performance.
Before you begin, ensure you have the following prerequisites:
- A server running Ubuntu 22.04.
- A non-root user with sudo privileges.
- A basic firewall configured. (Refer to the Initial Server Setup with Ubuntu 22.04 guide if needed.)
- Nginx web server installed. (Refer to the How To Install Nginx on Ubuntu 22.04 guide if needed.)
1. Configure Nginx for Varnish Cache Server
First, update your local package index with the following command:
sudo apt update
The default port for Nginx is 80. To integrate with Varnish, you need to change it to 8080.
To do this, use the following command:
sudo find /etc/nginx/sites-enabled -name '*.conf' -exec sed -r -i 's/\blisten ([^:]+:)?80\b([^;]*);/listen \18080\2;/g' {} ';'
The above command edits every config file in /etc/nginx/sites-enabled, rewriting each listen directive from port 80 to port 8080.
Also, you may need to edit the default Nginx site to listen on port 8080. Open the file with your favorite text editor (here, we use vi):
sudo vi /etc/nginx/sites-enabled/default
Apply the changes as shown below:
...
server {
    listen 8080 default_server;
    #listen [::]:80 default_server;
    # SSL configuration
    #
If the IPv6 listen directive is enabled on your server, change its port to 8080 as well.
When you are done, save and close the file.
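Before restarting, you can check the Nginx configuration for syntax errors with:
sudo nginx -t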
Now restart Nginx to apply the changes with the following command:
sudo systemctl restart nginx
Then, verify that Nginx is listening on port 8080 with the following command:
sudo netstat -pnlt | grep 8080
In the output, you should see nginx listening on port 8080.
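Note that netstat is part of the net-tools package, which Ubuntu 22.04 does not install by default. If the command is missing, you can use ss instead:
sudo ss -pnlt | grep 8080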
Now you can start to install Varnish Cache on Ubuntu 22.04.
2. Install Varnish Cache Server on Ubuntu 22.04
To install the Varnish Cache, you need to add the official Varnish Cache repository.
First, add the required dependencies with the following command:
sudo apt install debian-archive-keyring curl gnupg apt-transport-https -y
Add Varnish Cache 7.3 GPG Key
Then, add the GPG key for the package with the following curl command:
curl -fsSL https://packagecloud.io/varnishcache/varnish73/gpgkey | sudo gpg --dearmor -o /etc/apt/trusted.gpg.d/varnish.gpg
Add Varnish Cache 7.3 Repository
Now add the Varnish Cache repository for Ubuntu 22.04 (jammy) with the command below:
sudo tee /etc/apt/sources.list.d/varnishcache_varnish73.list > /dev/null <<-EOF
deb https://packagecloud.io/varnishcache/varnish73/ubuntu/ jammy main
deb-src https://packagecloud.io/varnishcache/varnish73/ubuntu/ jammy main
EOF
Update your local package index:
sudo apt update
Now you can install Varnish on Ubuntu 22.04 with the following command:
sudo apt install varnish -y
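To confirm which version was installed, you can run:
varnishd -V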
Varnish Cache is now installed on your server. Let’s see how to configure it.
3. Configure Varnish Cache on Ubuntu 22.04
At this point, you need to check the default address and port configuration. Open the Varnish configuration file with your favorite text editor, here we use vi:
sudo vi /etc/varnish/default.vcl
In the “backend default” section, make sure it looks like this:
backend default {
    .host = "127.0.0.1";
    .port = "8080";
}
When you are done, save and close the file.
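Optionally, you can also add caching logic to this file. As a minimal, illustrative sketch (the 120-second fallback TTL is an assumption; tune it for your application), a vcl_backend_response subroutine can give a default lifetime to responses that arrive without caching headers:
vcl 4.1;

backend default {
    .host = "127.0.0.1";
    .port = "8080";
}

sub vcl_backend_response {
    # If the backend sent no usable TTL, cache the object for 2 minutes (illustrative value)
    if (beresp.ttl <= 0s) {
        set beresp.ttl = 120s;
    }
}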
Now you need to configure Varnish to listen on port 80 instead of its default port 6081.
To be able to manage Varnish Cache 7 like other system services, we will adjust the systemd service as below.
sudo cp /lib/systemd/system/varnish.service /etc/systemd/system/
cat /etc/systemd/system/varnish.service
Then, edit the file and change the default port to port 80 and the cache size to 2GB.
sudo vi /etc/systemd/system/varnish.service
ExecStart=/usr/sbin/varnishd \
          -a :80 \
          -a localhost:8443,PROXY \
          -p feature=+http2 \
          -f /etc/varnish/default.vcl \
          -s malloc,2g
Save and close the file, when you are done.
To register the change, reload systemd with the following command:
sudo systemctl daemon-reload
Restart Varnish to apply the changes:
sudo systemctl restart varnish
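Optionally, enable the service so Varnish starts automatically at boot:
sudo systemctl enable varnish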
Verify that Varnish is listening on port 80 with the following command:
sudo netstat -ltnp | grep ':80'
In the output, you should see varnishd listening on port 80.
4. Testing Varnish Cache with Nginx
Now you can use the curl command to test the Varnish Cache on Ubuntu 22.04:
curl -I http://localhost/
Be sure that the X-Varnish and Via: 1.1 varnish (Varnish/7.3) headers appear in the output (the X-Varnish value is a transaction ID and will vary). These headers confirm that responses are passing through Varnish, which completes the installation of Varnish Cache with Nginx on Ubuntu 22.04.
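You can also watch the hit and miss counters with the varnishstat tool that ships with Varnish, for example:
varnishstat -1 -f MAIN.cache_hit -f MAIN.cache_miss
Running the curl command a second time should be served from the cache; on a hit, the X-Varnish header contains two transaction IDs.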
Conclusion
Varnish Cache is a powerful web application accelerator that can speed up the delivery of web content and improve the performance of websites and APIs. At this point, you have learned how to install Varnish Cache with Nginx on Ubuntu 22.04.
We hope you enjoyed it. Please subscribe to us on Facebook, Twitter, and YouTube.
Also, you may like to read the following articles:
How to set up PHP 7.3 Ubuntu 22.04
TCP congestion control algorithm Ubuntu 22.04
Go Language Setup Ubuntu 22.04
Install PostgreSQL Database Server on Ubuntu 22.04
Varnish Cache Server with Apache Ubuntu 22.04
Alternative Solutions for Caching with Nginx on Ubuntu 22.04
While Varnish provides excellent caching capabilities, other options exist for accelerating web application performance in conjunction with Nginx. Here are two alternative approaches:
1. Nginx Built-in Caching:
Nginx itself has built-in caching capabilities that can be configured without relying on external services like Varnish. This is a simpler solution for many use cases and can be very effective.
- Explanation: Nginx can cache both static and dynamic content. Static content caching is straightforward: Nginx stores copies of files (images, CSS, JavaScript) in memory or on disk and serves them directly to clients, bypassing the need to fetch them from the backend server. Dynamic content caching involves defining rules for how long and under what conditions Nginx should cache responses from your application.
- Implementation: To enable caching in Nginx, you’ll need to configure the proxy_cache_path and proxy_cache directives. First, define a cache zone:
http {
    proxy_cache_path /tmp/nginx_cache levels=1:2 keys_zone=my_cache:10m max_size=10g inactive=60m use_temp_path=off;
    ...
}
- /tmp/nginx_cache: The directory where cached files will be stored. Important: this directory must exist and be writable by the Nginx user. For production environments, choose a more permanent location.
- levels=1:2: Creates a two-level directory hierarchy under the cache directory, which helps prevent performance issues when a large number of files are stored in a single directory.
- keys_zone=my_cache:10m: Defines a shared memory zone named my_cache with a size of 10MB. This zone stores metadata about the cached items, such as keys and timestamps.
- max_size=10g: Sets the maximum size of the cache to 10GB.
- inactive=60m: Cached items that haven’t been accessed in 60 minutes will be evicted from the cache.
- use_temp_path=off: Specifies that files should be written directly to the cache directory, rather than using a temporary directory. This avoids unnecessary copying.
Then, enable caching for specific locations:
server {
    location / {
        proxy_pass http://backend;
        proxy_cache my_cache;
        proxy_cache_valid 200 302 60m;  # Cache successful responses for 60 minutes
        proxy_cache_valid 404 1m;  # Cache 404 errors for 1 minute
        proxy_cache_use_stale error timeout updating invalid_header http_500 http_502 http_503 http_504;  # Serve stale cache if the backend is unavailable
        proxy_cache_background_update on;  # Serve stale content while updating in the background
        proxy_cache_lock on;  # Prevent multiple requests for the same uncached resource
        add_header X-Cache-Status $upstream_cache_status;  # Indicate cache status (HIT, MISS, BYPASS)
    }
}
- proxy_pass http://backend: Specifies the upstream server (replace http://backend with the actual address of your backend server).
- proxy_cache my_cache: Enables caching using the my_cache zone defined earlier.
- proxy_cache_valid 200 302 60m: Specifies that responses with HTTP status codes 200 (OK) and 302 (Found) should be cached for 60 minutes. You can adjust the status codes and durations as needed.
- proxy_cache_valid 404 1m: Caches 404 (Not Found) errors for 1 minute. This can help reduce load on the backend server if it’s frequently returning 404s.
- proxy_cache_use_stale ...: Configures Nginx to serve stale (expired) cache content in various error scenarios, such as when the backend server is unavailable.
- proxy_cache_background_update on: Enables background updates of the cache. When a cached item expires, Nginx serves the stale content to the client while updating the cache in the background, which improves perceived performance.
- proxy_cache_lock on: Prevents multiple clients from requesting the same uncached resource at the same time. This can help reduce load on the backend server during cache misses.
- add_header X-Cache-Status $upstream_cache_status: Adds an X-Cache-Status header to the response, indicating whether the content was served from the cache (HIT), a cache miss occurred (MISS), or the cache was bypassed (BYPASS). This header is useful for debugging and monitoring cache performance.
Restart Nginx to apply the changes:
sudo systemctl restart nginx
Verify the cache status by sending a request and checking the X-Cache-Status header:
curl -I http://localhost/
The output should include X-Cache-Status: MISS on the first request (cache miss) and X-Cache-Status: HIT on subsequent requests (cache hit).
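Open-source Nginx has no built-in purge command, so a common approach while testing (assuming the /tmp/nginx_cache path used above) is simply to empty the cache directory and reload Nginx:
sudo rm -rf /tmp/nginx_cache/*
sudo systemctl reload nginx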
2. Using a Content Delivery Network (CDN):
- Explanation: A CDN is a geographically distributed network of servers that caches content closer to users, reducing latency and improving load times. While Varnish sits in front of your origin server, a CDN distributes cached content across multiple servers around the world.
- Implementation: Popular CDN providers include Cloudflare, Akamai, and AWS CloudFront.
- Sign up for a CDN service: Choose a provider and create an account.
- Configure your domain: Point your domain’s DNS records to the CDN provider’s servers.
- Configure caching rules: Define caching policies within the CDN’s control panel, specifying which content to cache and for how long. CDNs typically offer granular control over caching based on file type, URL patterns, and other criteria.
- Origin server configuration: Configure your Nginx server to work with the CDN. This may involve setting up specific headers or modifying your server configuration to prevent the CDN from caching certain content.
While a code example isn’t directly applicable to CDN configuration (as it primarily involves using the CDN provider’s web interface), an example of a header you might set in Nginx to control CDN caching is:
location /private-content {
    proxy_pass http://backend;
    proxy_cache_bypass $http_cookie;  # Bypass the cache if cookies are present
    add_header Cache-Control "private, no-cache, no-store, must-revalidate";  # Tell browsers and CDNs not to cache
}
This configuration tells the CDN not to cache content in the /private-content location. The Cache-Control header is important for instructing both browsers and CDNs, and proxy_cache_bypass prevents Nginx’s internal cache from being used as well.
Choosing between Varnish, Nginx’s built-in caching, and a CDN depends on your specific needs and the complexity of your application. Nginx caching is a good starting point for simple caching needs, while Varnish provides more advanced features and flexibility. A CDN is ideal for distributing content globally and handling high traffic loads. If you need those advanced features, the steps above to install Varnish Cache with Nginx on Ubuntu 22.04 give you a solid foundation.