Netlify Durable Cache: Caching for a third-party world
Since its release, a lot has been said about how Netlify’s Durable Cache works wonders for performance, largely thanks to the general benefits of caching and CDNs.
Keeping your content and data stored close to your users and not making expensive database calls will of course always result in faster websites, but we also live in a world of APIs: the composable web.
In the composable web, our performance is only as good as our weakest API… or at least it COULD be good. It could also be bad.
In this tutorial, we’ll take a look at setting up Netlify’s Durable Cache on a third-party API with (artificially) slow returns. We’ll use a proxied dev.to API, throttle it a bit, and see the effects of durable caching on the responses. We’ll also build a function to clear specific authors from the cache to get the most up-to-date information without a full rebuild.
Want to skip ahead and just examine the code? No problem, take a look at the GitHub repository to see what’s happening, or check out the final demo to try the endpoints.
Requirements
- Netlify Account
- Netlify CLI
- Federate This mock API
- Basic knowledge of Node and Netlify
Setup
In order to follow along with the project, you’ll need a blank project set up in Netlify. To do that, let’s create a simple directory and initialize a package.json:
mkdir netlify-durable-cache
cd netlify-durable-cache
npm init -y
# Install @netlify/functions for the TypeScript types (and more later)
npm install @netlify/functions
This will give us the basic structure for the project and let us initialize a git repository to connect to Netlify.
Create the GitHub repository and run netlify init in the directory to connect it to a project in Netlify (necessary for functions to work properly in netlify dev local environments).
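If it helps, the whole flow from the terminal looks roughly like this. It’s just a sketch: it assumes you have the GitHub CLI installed, so create the repository however you normally would.

git init
git add .
git commit -m "Initial commit"
# Create the remote repository and push (assumes the GitHub CLI)
gh repo create netlify-durable-cache --public --source=. --push
# Link this directory to a new Netlify project
netlify init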
Writing our function
Now that the project is set up, let’s construct a few functions. To start, let’s write a function with no caching as a baseline. This will be the basis for the cached function, as well.
Create a file at /netlify/functions/demo/index.mts.
While we could run this at the default endpoint structure for Netlify functions, it’s much cleaner to set up a redirect for these. Luckily, we can export a config object from our function file and set up the redirect AND any URL parameters we want to pass.
import { Config } from "@netlify/functions";

export const config: Config = {
  path: "/api/demo/:username"
};
From here, let’s set up the meat of the function. We want to reach out to the proxied API and return a list of articles for a given username from Dev.to. Let’s write a function we can use in our handler for this:
async function getArticles(username: string) {
  // Artificial slow API
  await new Promise(resolve => setTimeout(resolve, 2000));

  // API that returns articles from a proxy API to Dev.to
  const response = await fetch(`https://www.federatethis.com/api/devto/articles/${username}`);
  const data = await response.json();
  return data;
}
One slightly non-standard piece here: I’ve added a Promise and we await its resolution. This pauses our API call for 2 seconds. Not ideal for the real world, but for testing caching, it’s very beneficial.
Once we have this function, we can use it in our exported event handler.
import { Config, Context } from "@netlify/functions";

async function getArticles(username: string) {
  // Artificial slow API
  await new Promise(resolve => setTimeout(resolve, 2000));

  // API that returns articles from a proxy API to Dev.to
  const response = await fetch(`https://www.federatethis.com/api/devto/articles/${username}`);
  const data = await response.json();
  return data;
}

export default async (req: Request, context: Context) => {
  const { username } = context.params;
  const articles = await getArticles(username);

  const response = new Response(JSON.stringify({
    articles,
    username
  }));

  return response;
};

export const config: Config = {
  path: "/api/demo/:username"
};
Our handler will use the context object that Netlify creates for us. This allows us to use params to get our username from the URL. Then, we can pass that to our getArticles function and return a response with the articles and the username.
If you’re using the Lambda compatibility mode, the context object won’t contain the custom Netlify properties, so you won’t have access to the URL parameters and will need to deconstruct the URL to get the path.
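If you do find yourself pulling the username out of the URL by hand, the general idea looks roughly like the sketch below. This isn’t from the original handler, and the legacy Lambda-style handler receives an event object rather than a Request, so adapt it to whatever URL string you have available.

// Sketch: grab the last path segment from a URL like /api/demo/brob
function usernameFromUrl(rawUrl: string): string | undefined {
  const url = new URL(rawUrl);
  return url.pathname.split("/").filter(Boolean).pop();
}

// In the handler above, this would be: const username = usernameFromUrl(req.url);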
In your terminal, run netlify dev and your server will spin up. You can head to localhost:8888/api/demo/brob to get a list of my posts (or swap in your own username for a list of your dev.to posts).
Now, this works, but... our API is soooooo slow! Each request is taking at least 2.3 seconds. This API is untrustworthy and should be cached aggressively to make this a great experience for most of our users.
Enter Netlify’s Durable Cache and Netlify’s cdn-cache-control NPM package.
Setting up caching
As you may have guessed, the first step here is to install the package I just mentioned. While you can write these headers by hand… well, nobody has time for all that. Netlify provides some great defaults out of the box with the package, so let’s install it and use it!
npm install cdn-cache-control
Once the package is installed, we can make full use of it. In our default function export, we can create new cache headers using the package and pass them to our response. This tells the browser important information, and it also clues Netlify’s CDN into the plan.
import { Config, Context } from "@netlify/functions";
import { CacheHeaders } from "cdn-cache-control";

async function getArticles(username: string) {
  // Artificial slow API
  await new Promise(resolve => setTimeout(resolve, 2000));

  // API that returns articles from a proxy API to Dev.to
  const response = await fetch(`https://www.federatethis.com/api/devto/articles/${username}`);
  const data = await response.json();
  return data;
}

export default async (req: Request, context: Context) => {
  const { username } = context.params;
  const articles = await getArticles(username);

  const headers = new CacheHeaders();

  const response = new Response(JSON.stringify({
    articles,
    username
  }), {
    headers
  });

  return response;
};

export const config: Config = {
  path: "/api/demo/:username"
};
With a few tweaks, we now have multiple headers being added to our response. Let’s take a look at that:
Netlify-CDN-Cache-Control: public,s-maxage=31536000,must-revalidate,durable
Cache-Control: public,max-age=0,must-revalidate
The package adds the standards-compliant cache-control header for the browser, as well as the netlify-cdn-cache-control header to communicate with the CDN. The CDN cache is set to a year and marked as durable.
While I’ll colloquially refer to this as “the CDN’s cache”, what’s really happening is a little deeper. Setting the header to durable informs Netlify’s CDN that it needs to store the response in a new shared cache called the “durable cache”. This cache is shared by all CDN nodes, which can then cache the content locally for even faster response times.
This article in the documentation provides a great overview of the durable cache mechanism. If you just want to refer to it as your “CDN’s cache”, though, I won’t tell anyone.
The browser cache is set to revalidate and always send the request. We could have a deeper conversation here about adding a client-side cache on top of this, but the CDN response is so fast that the gains are a fun philosophical debate rather than something worth chasing in this article. The CDN response is now cached for a year (or until cleared), so when the browser sends the request, the CDN can respond immediately.
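If you did want to experiment with a short browser cache anyway, a hand-rolled sketch of the response construction might look like this. The headers are set manually here for illustration, and the 60-second max-age is an arbitrary number I picked, not a Netlify default.

// Sketch: same handler as before, but with an explicit short browser cache
const response = new Response(JSON.stringify({ articles, username }), {
  headers: {
    // Let the browser reuse the response for up to a minute
    "Cache-Control": "public, max-age=60, must-revalidate",
    // Keep the same CDN directives the package generated for us
    "Netlify-CDN-Cache-Control": "public, s-maxage=31536000, must-revalidate, durable",
  },
});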
In order to truly see the cache in action, you’ll want to deploy it to Netlify.
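Either push to the connected GitHub repository and let Netlify build it, or deploy straight from the CLI. Something like this should do it, assuming the project is already linked:

git push origin main
# or, skipping the git round trip:
netlify deploy --build --prod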
Once deployed, hit the endpoint at the URL Netlify provides. The first response will take approximately 2-3 seconds. This fires the function, retrieves the data, stores it in that CDN node’s local cache, and stores it in the durable cache. Every response after that will be served from the original node’s cache for you AND from the durable cache for others. Each request takes approximately 40-70 milliseconds (at least in my experience). That’s a notable savings from a problematic API.
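A quick, unscientific way to see the difference is to time the request with curl (the URL here is a placeholder, so substitute your own site):

curl -s -o /dev/null -w "total: %{time_total}s\n" https://your-site.netlify.app/api/demo/brob
# Run it a couple of times: the first request should take 2+ seconds, the rest
# tens of milliseconds. In my experience the response also includes a
# Cache-Status header (visible with curl -sI) showing whether it was a hit.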
But, as they say, there are two hard problems in programming: naming things and invalidating caches. If this is stored for a year, how do we invalidate it and create fresh content on the CDN?
Let’s purge the cache!
Purging cache tags
To make the cache purgeable, we need to ensure it’s named. Both standard cache headers and Netlify’s own headers provide a mechanism for tagging a cached response with any number of cache tags. This gives you an easy way to purge specific tags at any given time.
Let’s modify the function to use this functionality in the headers it’s sending.
// ...
const headers = new CacheHeaders().tag("articles", "articles-" + username);
// ...
The CacheHeaders instance is fully chainable, meaning you can keep chaining methods onto it to layer on additional functionality. In this case, we use the .tag() method and pass it as many tags as we want. For this example, let’s tag our caches with both articles and articles-<username>. If you’re following along, this would tag the cache with articles and articles-brob.
Both tags may be something you want to purge at some point. The specific articles-brob tag will clear the cache for just the endpoint that specified the username brob. The general articles tag will clear the cache for every response tagged articles, which in our demo is all of the endpoints. Usually we’ll want to clear just one cache, but there are definitely reasons for doing a bigger clearing.
Setting up a cache purging serverless function
In order to clear a given tag, we need a function to clear it. Let’s keep this separate and make it a utility function for our site.
Set up a new file at /netlify/functions/purge-cache/index.mts. Once created, we’ll set up a config to locate this function at /api/purge-cache/:tag and write a function that uses Netlify’s purgeCache method from the @netlify/functions package (not JUST for the types after all).
import { Config, Context, purgeCache } from "@netlify/functions";

export default async (req: Request, context: Context) => {
  const { tag } = context.params;

  // If no tag was provided, bail out early
  if (!tag) {
    return new Response("No tag provided", { status: 400 });
  }

  console.log("Purging cache for", tag);

  await purgeCache({
    tags: [tag],
  });

  return new Response("Purged!", { status: 202 });
};

export const config: Config = {
  path: "/api/purge-cache/:tag"
};
That function takes the tag you enter and uses Netlify’s purgeCache method to tell Netlify’s caches to purge it. It shouldn’t be this easy, but it is. You’ll also notice that purgeCache accepts an array of tags, so you could relatively simply change this function to clear multiple tags instead of just one. Consider that your homework.
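If you want a head start on that homework, one possible approach (and only a sketch) is to treat the :tag parameter as a comma-separated list and split it before purging:

import { Context, purgeCache } from "@netlify/functions";

// Sketch: purge /api/purge-cache/articles,articles-brob in one request
export default async (req: Request, context: Context) => {
  const { tag } = context.params;

  if (!tag) {
    return new Response("No tag provided", { status: 400 });
  }

  // Treat the parameter as a comma-separated list of tags
  const tags = tag.split(",").map((t) => t.trim()).filter(Boolean);

  await purgeCache({ tags });

  return new Response(`Purged: ${tags.join(", ")}`, { status: 202 });
};

// The config export stays the same as above.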
Now, deploy the function.
Deploys ALSO invalidate caches
Keep in mind that when a new build happens, Netlify will invalidate the caches, as well. If you want to see the purge in action, you’ll need to recreate your cache by visiting the original endpoint.
If you visit /api/purge-cache/articles-brob, it will clear the specific cache for my articles. The next time you visit the original endpoint, it should take the full load time again (assuming no one else hits your endpoint before you!).
This function could be used in conjunction with webhooks or other communication methods to keep things fresh at just the right time instead of revalidating on a fixed schedule.
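For example, a CMS webhook or an “on publish” hook could simply hit the purge endpoint for the author that changed (the URL here is a placeholder; swap in your own site):

curl -X POST https://your-site.netlify.app/api/purge-cache/articles-brob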
Conclusion
In this tutorial, we've explored how to leverage Netlify's Durable Cache to significantly improve the performance of third-party API calls. We started with a slow, unreliable API and transformed it into a lightning-fast, cached response system. Here's a quick recap:
- Set up a serverless function to proxy API calls to Dev.to (with an artificial delay)
- Implemented Netlify's Durable Cache using the cdn-cache-control package
- Added cache tags for granular cache management
- Created a cache purging function for on-demand cache invalidation
By setting this up, we've created a robust caching strategy that can drastically reduce load times from seconds to milliseconds. This approach not only improves user experience but also reduces the load on third-party services and our own infrastructure.
Remember, while caching is powerful, it's important to balance freshness with performance. The cache purging function we created allows for this balance, enabling you to update content when needed without sacrificing the speed benefits of caching.
As you build your own projects, consider how you can apply these caching strategies to optimize your third-party integrations and create faster, more responsive web applications.