A simple implementation of stale-while-revalidate caching using async/await


New is not always better! Returning cached responses can improve app speed and resilience. Keep reading to find out why and how it works.

Recently I faced the problem of writing a service worker that implements a very basic stale-while-revalidate caching strategy. Searching the web turns up many proposed implementations of this pattern, but I found none that uses state-of-the-art, low-overhead async/await. Personally, I cannot stand looking at three or more nested .then() callbacks, especially since async functions are supported by all relevant browsers in 2021. For this post I deliberately formatted the code in the most readable way; of course there are some opportunities to minify. Here is the code:

async function fetchAndCacheIfOk(event) {
  try {
    const response = await fetch(event.request);

    // don't cache non-ok responses
    if (response.ok) {
      const responseClone = response.clone();
      const cache = await caches.open("my-cache-v1");
      await cache.put(event.request, responseClone);
    }

    return response;
  } catch (e) {
    // on network failure, hand the error back; respondWith treats
    // a non-Response value as a failed fetch
    return e;
  }
}

async function fetchWithCache(event) {
  const cache = await caches.open("my-cache-v1");
  const response = await cache.match(event.request);
  if (response) {
    // it is cached, but we want to update it, so request without awaiting
    fetchAndCacheIfOk(event);
    // return the cached response
    return response;
  } else {
    // it was not cached yet, so request and cache
    return fetchAndCacheIfOk(event);
  }
}

function handleFetch(event) {
  // only intercept the request if there is no no-cache header
  if (event.request.headers.get("cache-control") !== "no-cache") {
    // important: respondWith has to be called synchronously, otherwise
    // the service worker won't know what's going on.
    // Had to learn this the hard way
    event.respondWith(fetchWithCache(event));
  }
}

self.addEventListener("fetch", handleFetch);
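Since handleFetch only intercepts requests that do not carry a cache-control: no-cache header, individual calls can still opt out of the cache entirely. A minimal sketch of such a call (the URL is just a placeholder):

// force this request to bypass the service worker cache
fetch("/api/always-fresh", {
  headers: { "cache-control": "no-cache" }
});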

The implementation satisfies the three core requirements I needed to meet:

  • Intercepts fetch requests and instantly serves a cached response if one is available
  • Requests a fresh response every time it intercepts a fetch and saves it to the CacheStorage
  • Does not cache non-ok responses (ok meaning anything in the 200-299 range; feel free to change this condition if it does not suit your needs, see the sketch below)
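If you want a different caching condition, only the if inside fetchAndCacheIfOk has to change. As one possible variation (a sketch, not part of the worker above), this additionally restricts caching to GET requests, since cache.put only accepts GET requests anyway:

// variation: only cache ok responses to GET requests
if (response.ok && event.request.method === "GET") {
  const responseClone = response.clone();
  const cache = await caches.open("my-cache-v1");
  await cache.put(event.request, responseClone);
}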

The main benefit of this strategy, and also the reason why I chose it, is that it increases the resilience of the application when some vital endpoints are occasionally unavailable. In my specific case it is used to cache requests that are needed on app init and that would cause the init process to fail if unavailable. A pre-caching strategy is not possible in this specific scenario because the requests contain authentication and can vary from user to user.

As a side effect, the responses of the cached requests are served almost instantaneously. In my case this decreased the average startup time of the app a bit, which is nice.
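For completeness: the worker still has to be registered from the page before any of this kicks in. A minimal sketch, assuming the worker is saved as sw.js at the site root:

// register the service worker once the page has loaded
if ("serviceWorker" in navigator) {
  window.addEventListener("load", () => {
    navigator.serviceWorker.register("/sw.js");
  });
}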

Feel free to use, extend or improve this basic implementation in any of your projects!