Caching strategies

Original source (Apache 2.0 License). Adapted for Serwist’s usage.

Introduction

A caching strategy is a pattern that determines how a service worker generates a response after receiving a fetch event.

Serwist provides a number of strategies out of the box, and you can write your own strategies tailored to your use cases.

Built-in strategies

These strategies are built into Serwist (see the configuration sketch after this list):

  • StaleWhileRevalidate — Uses a cached response for a request if one is available and updates the cache in the background with a response from the network. If the asset isn’t cached, the strategy waits for the network response and uses that. It’s a fairly safe strategy, as the cache entries it serves are regularly refreshed. The downside is that it always makes a network request in the background, consuming bandwidth.
  • NetworkFirst — Tries to get a response from the network first. If a response is received, it passes that response to the browser and saves it to a cache. If the network request fails, the last cached response will be used, enabling offline access to the asset.
  • CacheFirst — Checks the cache for a response first and uses it if available. If the request isn’t in the cache, the network is used and any valid response is added to the cache before being passed to the browser.
  • NetworkOnly — Forces the response to come from the network.
  • CacheOnly — Forces the response to come from the cache.
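Each of these strategies accepts options when constructed, such as cacheName (which Cache API cache to use) and plugins (see Using plugins below). The following is a minimal configuration sketch; the cache names, timeout, and expiration values are illustrative rather than defaults:

import { CacheFirst, ExpirationPlugin, NetworkFirst } from "serwist";

// Cache-first for images: entries live in a dedicated cache and expire
// after 30 days or once there are more than 60 of them.
const imageStrategy = new CacheFirst({
  cacheName: "images",
  plugins: [new ExpirationPlugin({ maxEntries: 60, maxAgeSeconds: 30 * 24 * 60 * 60 })],
});

// Network-first for API calls: fall back to the cache if the network
// doesn't answer within three seconds.
const apiStrategy = new NetworkFirst({
  cacheName: "api",
  networkTimeoutSeconds: 3,
});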

Using plugins

See Using plugins.

Creating a new strategy

The following is an example of a strategy that reimplements the behavior of NetworkOnly:

import { Strategy, type StrategyHandler } from "serwist";

class NetworkOnly extends Strategy {
  _handle(request: Request, handler: StrategyHandler) {
    // Delegate to handler.fetch() so that the strategy's plugin callbacks still run.
    return handler.fetch(request);
  }
}

Notice that handler.fetch() is called rather than the global fetch(); among other things, this means that plugins attached to the strategy still apply. See the reference documentation on StrategyHandler for more details.
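For instance, a plugin’s lifecycle callbacks run even though the class is custom. A small sketch with a hypothetical logging plugin (the plugin object and its message are illustrative; fetchDidSucceed is one of the standard lifecycle callbacks):

// Hypothetical logging plugin. Since _handle() calls handler.fetch(),
// lifecycle callbacks such as fetchDidSucceed are still invoked.
const loggingPlugin = {
  fetchDidSucceed: async ({ request, response }: { request: Request; response: Response }) => {
    console.log(`[NetworkOnly] ${request.url} -> ${response.status}`);
    return response;
  },
};

// The custom NetworkOnly class above inherits the base Strategy constructor,
// so it accepts the usual options, including plugins.
const networkOnly = new NetworkOnly({ plugins: [loggingPlugin] });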

The following is a more complex strategy that performs multiple actions. It is based on cache-network-race from the Offline Cookbook, but goes a step further: it always updates the cache after a successful network request.

import { Strategy, type StrategyHandler } from "serwist";

class CacheNetworkRace extends Strategy {
  _handle(request: Request, handler: StrategyHandler) {
    // Start the network request (which also writes the response to the cache)
    // and the cache lookup at the same time.
    const fetchAndCachePutDone = handler.fetchAndCachePut(request);
    const cacheMatchDone = handler.cacheMatch(request);

    return new Promise<Response>((resolve, reject) => {
      // Resolve with whichever yields a response first. A rejection here is
      // deliberately ignored; it is handled by Promise.allSettled() below.
      fetchAndCachePutDone.then(resolve, () => {});
      cacheMatchDone.then((response) => {
        if (response) {
          resolve(response);
        }
      });

      // Reject if both network and cache error or find no response.
      Promise.allSettled([fetchAndCachePutDone, cacheMatchDone]).then((results) => {
        const [fetchAndCachePutResult, cacheMatchResult] = results;
        if (fetchAndCachePutResult.status === "rejected" && (cacheMatchResult.status === "rejected" || !cacheMatchResult.value)) {
          reject(fetchAndCachePutResult.reason);
        }
      });
    });
  }
}
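A custom strategy can be plugged into Serwist’s routing like any built-in one by passing an instance of it as a handler. A minimal sketch, assuming the Serwist class and its runtimeCaching option from the serwist package; the matcher is illustrative:

import { Serwist } from "serwist";

const serwist = new Serwist({
  runtimeCaching: [
    {
      // Illustrative matcher: race the cache and the network for same-origin API requests.
      matcher: ({ url, sameOrigin }) => sameOrigin && url.pathname.startsWith("/api/"),
      handler: new CacheNetworkRace(),
    },
  ],
});

serwist.addEventListeners();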

Using a strategy in your custom fetch logic

If you do not want to use Serwist’s built-in routing, you can also use the strategies in your own fetch event listener:

import { StaleWhileRevalidate } from "serwist";

declare const self: ServiceWorkerGlobalScope;

const swr = new StaleWhileRevalidate();

self.addEventListener("fetch", (event) => {
  const { request } = event;
  const url = new URL(request.url);

  if (url.origin === location.origin && url.pathname === "/") {
    event.respondWith(swr.handle({ event, request }));
  }
});
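Strategy instances also expose handleAll(), which returns a pair of promises: one for the response and one that settles once all handler work, such as background cache updates, has finished. If you need to keep the service worker alive until that work completes, you can use it instead of handle(). A sketch, assuming handleAll() as documented in the Strategy reference:

self.addEventListener("fetch", (event) => {
  const { request } = event;
  const url = new URL(request.url);

  if (url.origin === location.origin && url.pathname === "/") {
    // handleAll() returns [responsePromise, donePromise]. Waiting on donePromise
    // keeps the service worker alive until the background cache update completes.
    const [responseDone, handlerDone] = swr.handleAll({ event, request });
    event.respondWith(responseDone);
    event.waitUntil(handlerDone);
  }
});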