How I Made My Next.js App Work Offline (Vanilla SW)
Struggling with offline support in Next.js? Learn how to implement a vanilla Service Worker from scratch for full control and a blazing-fast, reliable PWA.
Alex Carter
Senior front-end developer specializing in performance optimization and Progressive Web Apps.
There’s a certain magic to building a web application with Next.js. The developer experience is slick, performance is top-notch, and features like server-side rendering and static site generation feel like superpowers. But then, reality hits. A user hops on a spotty train connection, their network drops, and your beautiful, lightning-fast app is replaced by the dreaded “No Internet” dinosaur. All that magic, gone in an instant.
I hit this wall a few months ago. My portfolio site, built with Next.js, was fast and responsive, as long as you were online. I wanted it to be resilient, to offer a baseline experience even on a shaky connection or completely offline. The obvious path was to turn it into a Progressive Web App (PWA). While there are excellent libraries like `next-pwa` that handle this automatically, I found myself wanting to understand the gears turning underneath. I wanted full control over caching strategies and a deeper knowledge of the technology that makes offline-first possible: the Service Worker.
So, I decided to go vanilla. This post is the story of how I manually implemented a Service Worker in my Next.js application from scratch. We’ll walk through every step, from registering the worker to defining caching strategies and managing updates. If you've ever been curious about what’s happening behind the PWA curtain or want fine-grained control over your app's offline behavior, you're in the right place.
Why Go Vanilla with Your Service Worker?
Before we dive into the code, let's address the elephant in the room: why not just use a library? Packages like `next-pwa` are fantastic for getting a PWA up and running with minimal configuration. However, taking the manual route offers some distinct advantages.
| Aspect | `next-pwa` (and similar libraries) | Vanilla Service Worker |
|---|---|---|
| Ease of Use | High. Minimal setup required in `next.config.js`. | Lower. Requires manual coding and an understanding of the SW lifecycle. |
| Control & Customization | Limited. You work within the options the library provides. | Total. You can implement any caching strategy imaginable (Cache First, Network First, Stale-While-Revalidate) on a per-request basis. |
| Learning Curve | Low. You learn the library's configuration. | Steeper. You learn the actual Web API, a transferable skill. |
| Debugging | Can be opaque. Is the issue in my config, the library, or the underlying Workbox? | Direct. You are debugging your own code, which can be more straightforward. |
For me, the primary motivator was control and learning. I wanted to decide precisely which assets were cached, when they were updated, and how to handle dynamic routes. Writing my own Service Worker forced me to learn the lifecycle (`install`, `activate`, `fetch`) and the Cache API, giving me a robust toolset for any future project.
The Setup: Placing Your Service Worker in Next.js
The first rule of Service Workers is about scope. A Service Worker can only control pages that are within its scope, which is determined by where the SW file is located on the server. To control your entire site (e.g., `yourapp.com/`), the Service Worker file must be served from the root directory.
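To make that concrete, here is a quick sketch of how registration scope plays out depending on where the file lives (the `/blog/` and `/assets/` paths are purely illustrative):

```js
// Served from the root, the worker can control every page on the origin:
navigator.serviceWorker.register('/sw.js'); // scope defaults to '/'

// You may narrow the scope explicitly, but you can never widen it
// beyond the directory the script is served from:
navigator.serviceWorker.register('/sw.js', { scope: '/blog/' }); // controls /blog/* only

// A worker served from a subdirectory is limited to that subtree:
navigator.serviceWorker.register('/assets/sw.js'); // scope is '/assets/' at most
```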
In a Next.js project, the solution is simple: place your Service Worker file in the `public` directory. Any file in `public` is served statically from the root of your domain.
So, let's create our file: `public/sw.js`. For now, it can be empty. We'll populate it soon.
Step 1: Registering the Service Worker
A Service Worker is a JavaScript file that runs in the background, separate from your web page. To get it running, we need to register it from our client-side code. A great place to do this is in your main layout component or `_app.tsx` file, so it runs on every page load.
We'll use a `useEffect` hook to run this code on the client side after the component mounts.
```jsx
// In pages/_app.tsx or a root layout component
import { useEffect } from 'react';

function MyApp({ Component, pageProps }) {
  useEffect(() => {
    if ('serviceWorker' in navigator) {
      const register = () => {
        navigator.serviceWorker.register('/sw.js').then(
          (registration) => {
            console.log('Service Worker registration successful with scope: ', registration.scope);
          },
          (err) => {
            console.log('Service Worker registration failed: ', err);
          }
        );
      };

      // Wait for the page to finish loading, but register immediately if the
      // 'load' event has already fired by the time this effect runs.
      if (document.readyState === 'complete') {
        register();
      } else {
        window.addEventListener('load', register);
        return () => window.removeEventListener('load', register);
      }
    }
  }, []);

  return <Component {...pageProps} />;
}

export default MyApp;
```
Code Breakdown
- `if ('serviceWorker' in navigator)`: This is a feature check to ensure the browser supports Service Workers before we try to use the API.
- `window.addEventListener('load', register)`: We wait until the page has fully loaded before registering the Service Worker, a best practice to avoid delaying the initial page render with non-critical work. Because the effect can run after the `load` event has already fired, we check `document.readyState` first and register immediately in that case.
- `navigator.serviceWorker.register('/sw.js')`: This is the key command. It tells the browser to find the file at `/sw.js`, download it, and begin the installation process in the background. The returned Promise resolves when registration is successful.
Now, if you run your app and check the browser's developer tools (under Application > Service Workers), you should see `sw.js` listed as registered and activated!
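If your project uses the App Router rather than `pages/`, the same registration logic works from a small client component. Here is a minimal sketch; the file name and where you mount it are your choice:

```jsx
// app/sw-register.jsx (a hypothetical client component; render it once in app/layout.jsx)
'use client';

import { useEffect } from 'react';

export default function ServiceWorkerRegister() {
  useEffect(() => {
    if ('serviceWorker' in navigator) {
      navigator.serviceWorker
        .register('/sw.js')
        .then((reg) => console.log('Service Worker registered with scope:', reg.scope))
        .catch((err) => console.log('Service Worker registration failed:', err));
    }
  }, []);

  return null; // renders nothing; it exists only for the registration side effect
}
```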
Step 2: The 'Install' Event - Caching the App Shell
With our Service Worker registered, it's time to make it do something. The first lifecycle event we'll hook into is `install`. This event fires only once per Service Worker version, right after the browser downloads the file. It's the perfect opportunity to cache the essential static assets that make up your "App Shell"—the minimal HTML, CSS, and JavaScript needed to power the user interface.
Let's add the following to our `public/sw.js` file:
```js
// In public/sw.js
const CACHE_NAME = 'my-app-cache-v1';
const ASSETS_TO_CACHE = [
  '/',
  '/styles/globals.css', // Adjust to your actual CSS file
  '/icons/icon-192x192.png', // Example icon
  '/offline.html' // A fallback page for when all else fails
];

// The 'install' event is fired when the service worker is first installed.
self.addEventListener('install', (event) => {
  console.log('[Service Worker] Install');
  // We use event.waitUntil to ensure the service worker doesn't move on
  // from the 'install' phase until the promise passed to it has resolved.
  event.waitUntil(
    caches.open(CACHE_NAME).then((cache) => {
      console.log('[Service Worker] Caching app shell');
      return cache.addAll(ASSETS_TO_CACHE);
    })
  );
});
```
Code Breakdown
- `CACHE_NAME`: We give our cache a versioned name. When we want to update our assets, we'll simply increment the version number in this string.
- `ASSETS_TO_CACHE`: An array of URLs to pre-cache. We're starting with the root page (`/`), a global stylesheet, an icon, and a dedicated `offline.html` page (you'll need to create this simple HTML file in your `public` directory).
- `self.addEventListener('install', ...)`: We listen for the `install` event. `self` refers to the Service Worker global scope.
- `event.waitUntil(...)`: This method takes a promise and uses it to know how long installation takes, and whether it succeeded. If any of the files in `addAll` fail to download, the entire installation step fails.
- `caches.open(CACHE_NAME).then(...)`: This opens a specific cache by name. The Cache API allows you to have multiple caches.
- `cache.addAll(...)`: This fetches an array of URLs and stores the responses in the cache.
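After a reload, you can sanity-check what actually made it into the cache straight from the DevTools console (this assumes the cache name used throughout this article):

```js
// Paste into the browser console on your site:
const cache = await caches.open('my-app-cache-v1');
const requests = await cache.keys();
console.log(requests.map((req) => req.url));
```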
Step 3: The 'Fetch' Event - Intercepting Requests
This is where the offline magic happens. The `fetch` event fires every single time your application makes a network request—whether it's for a page, a script, an image, or an API call. We can intercept this event and decide how to respond.
Our strategy will be "Cache First, falling back to Network". For any given request, we'll first check if we have a valid response in our cache. If we do, we'll serve it immediately. If not, we'll proceed with the network request.
Add this to `public/sw.js`:
```js
// The 'fetch' event is fired for every network request.
self.addEventListener('fetch', (event) => {
  console.log(`[Service Worker] Fetching resource: ${event.request.url}`);
  // We use event.respondWith to hijack the request and provide our own response.
  event.respondWith(
    caches.match(event.request).then((cachedResponse) => {
      // If a cached response is found, return it.
      if (cachedResponse) {
        console.log(`[Service Worker] Returning from cache: ${event.request.url}`);
        return cachedResponse;
      }
      // If not in cache, perform the network request.
      console.log(`[Service Worker] Fetching from network: ${event.request.url}`);
      return fetch(event.request).catch(() => {
        // If the network request fails (e.g., user is offline),
        // return the offline fallback page for navigation requests.
        if (event.request.mode === 'navigate') {
          return caches.match('/offline.html');
        }
        // For everything else, surface an ordinary network error instead of
        // resolving with undefined, which respondWith would reject.
        return Response.error();
      });
    })
  );
});
```
Code Breakdown
- `event.respondWith(...)`: This method is our chance to take control. We pass it a promise that resolves with a `Response` object.
- `caches.match(event.request)`: This checks all of your caches to see if any of them contain a stored response that matches the current request.
- `return cachedResponse`: If a match is found, we return it instantly, without ever hitting the network. This is what makes offline access so fast.
- `return fetch(event.request)`: If no match is found in the cache, we let the original network request proceed.
- `.catch(...)`: This is our crucial fallback. If the `fetch` fails (because the network is down), we check if it was a navigation request (`mode: 'navigate'`). If so, we serve our pre-cached `/offline.html` page, providing a much better user experience than a browser error. For any other request type we return `Response.error()`, so the failure shows up as a normal network error rather than an exception inside `respondWith`.
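If you also want pages and assets visited while online to be available offline later, a common extension is to cache successful GET responses at runtime. This variation is not part of the file we're building in this post, just a sketch of how it could look (it reuses the `CACHE_NAME` defined earlier):

```js
// Cache First, with runtime caching of anything fetched from the network.
self.addEventListener('fetch', (event) => {
  if (event.request.method !== 'GET') return; // only cache idempotent requests

  event.respondWith(
    caches.match(event.request).then((cachedResponse) => {
      if (cachedResponse) {
        return cachedResponse;
      }
      return fetch(event.request)
        .then((networkResponse) => {
          // Only store successful responses, and clone first: a body can be read once.
          if (networkResponse.ok) {
            const copy = networkResponse.clone();
            caches.open(CACHE_NAME).then((cache) => cache.put(event.request, copy));
          }
          return networkResponse;
        })
        .catch(() => {
          if (event.request.mode === 'navigate') {
            return caches.match('/offline.html');
          }
          return Response.error();
        });
    })
  );
});
```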
Step 4: The 'Activate' Event - Cleaning Up Old Caches
What happens when you update your site and deploy a new Service Worker with an updated `CACHE_NAME` (e.g., `my-app-cache-v2`)? The old Service Worker and its cache (`v1`) will hang around until all tabs using it are closed. The `activate` event gives us a perfect moment to clean up.
This event fires after a new Service Worker has been installed and is ready to take control. We'll use it to delete any old, unused caches.
Add this final piece to `public/sw.js`:
```js
// The 'activate' event is fired when the new service worker takes control.
self.addEventListener('activate', (event) => {
  console.log('[Service Worker] Activate');
  const cacheWhitelist = [CACHE_NAME];
  event.waitUntil(
    caches.keys().then((cacheNames) => {
      return Promise.all(
        cacheNames.map((cacheName) => {
          // If the cache name is not in our whitelist, we delete it.
          if (cacheWhitelist.indexOf(cacheName) === -1) {
            console.log(`[Service Worker] Deleting old cache: ${cacheName}`);
            return caches.delete(cacheName);
          }
        })
      );
    })
  );
});
```
Code Breakdown
- The `activate` listener fires, signaling the new Service Worker is in charge.
- We create a `cacheWhitelist` containing only the current, active `CACHE_NAME`.
- `caches.keys()` gives us an array of all cache names currently stored by our origin.
- We map over these names and delete any cache that isn't in our whitelist. This ensures we don't leave gigabytes of outdated assets on a user's device over time.
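One related note: a freshly installed worker normally sits in a waiting state until every tab controlled by the old one has closed. If you want updates to take effect sooner, a common optional pattern is `skipWaiting()` plus `clients.claim()`. Here is a sketch of where they would slot into the handlers we already have:

```js
// Optional: let a new service worker take over without waiting for old tabs to close.
self.addEventListener('install', (event) => {
  self.skipWaiting(); // don't sit in the 'waiting' state
  // ...pre-caching as before
});

self.addEventListener('activate', (event) => {
  event.waitUntil(self.clients.claim()); // take control of open pages immediately
  // ...cache cleanup as before
});
```

Use this deliberately: claiming clients mid-session means a page can switch workers without a reload, which can be surprising if your caching logic changed significantly between versions.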
Putting It All Together: The Full Code
Here is the complete `public/sw.js` file for easy reference. You can use this as a starting point for your own project.
```js
// public/sw.js
const CACHE_NAME = 'my-app-cache-v1';
const ASSETS_TO_CACHE = [
  '/',
  '/offline.html',
  // Add your crucial assets here. Be mindful of Next.js's build outputs.
  // For example, you might need to find a way to dynamically add the CSS/JS chunk names.
];

self.addEventListener('install', (event) => {
  event.waitUntil(
    caches.open(CACHE_NAME).then((cache) => {
      console.log('[Service Worker] Caching app shell');
      return cache.addAll(ASSETS_TO_CACHE);
    })
  );
});

self.addEventListener('activate', (event) => {
  const cacheWhitelist = [CACHE_NAME];
  event.waitUntil(
    caches.keys().then((cacheNames) => {
      return Promise.all(
        cacheNames.map((cacheName) => {
          if (cacheWhitelist.indexOf(cacheName) === -1) {
            return caches.delete(cacheName);
          }
        })
      );
    })
  );
});

self.addEventListener('fetch', (event) => {
  event.respondWith(
    caches.match(event.request).then((cachedResponse) => {
      if (cachedResponse) {
        return cachedResponse;
      }
      return fetch(event.request).catch(() => {
        if (event.request.mode === 'navigate') {
          return caches.match('/offline.html');
        }
        return Response.error();
      });
    })
  );
});
```
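That comment about build outputs is the fiddly part: Next.js emits hashed JS and CSS files under `/_next/static/`, so you can't hard-code them. One approach is a small post-build script that reads the build manifest and writes a precache list your worker can fetch during `install`. This is only a sketch under the assumption that your build produces `.next/build-manifest.json` with a `pages` map of chunk paths, which Next.js currently does; verify against your own output:

```js
// scripts/generate-precache-list.mjs (a hypothetical post-build step: node scripts/generate-precache-list.mjs)
import { readFile, writeFile } from 'node:fs/promises';

const manifest = JSON.parse(await readFile('.next/build-manifest.json', 'utf8'));

// Collect every unique chunk referenced by any page and prefix it with /_next/.
const chunks = [...new Set(Object.values(manifest.pages).flat())].map(
  (file) => `/_next/${file}`
);

const assets = ['/', '/offline.html', ...chunks];
await writeFile('public/precache-manifest.json', JSON.stringify(assets, null, 2));
console.log(`Wrote ${assets.length} URLs to public/precache-manifest.json`);
```

Your `install` handler could then fetch `/precache-manifest.json`, parse it, and pass the result to `cache.addAll` instead of the hard-coded array.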
Final Thoughts and Next Steps
And there you have it! By manually writing a Service Worker, we've given our Next.js app a foundational layer of offline resilience. We're no longer at the mercy of the network. Users can now load the pre-cached app shell instantly and see a helpful offline page instead of a browser error when a navigation fails; add runtime caching like the variant sketched in Step 3 and they can even revisit pages they've already seen.
This is just the beginning. The "Cache First" strategy we implemented is simple and effective, but you can now explore more advanced patterns:
- Stale-While-Revalidate: Serve content from the cache immediately for speed, but also fire off a network request in the background to update the cache with fresh content for the next visit (see the sketch just after this list).
- Network First: Try the network first and only fall back to the cache if the network fails. Ideal for content that changes frequently, like a social media feed.
- Dynamic Caching: Get more granular with what you cache from network responses. You could cache API calls with a specific lifetime or only cache images.
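To make the first of those concrete, here is a stale-while-revalidate sketch, again assuming the same `CACHE_NAME` as in the rest of this post:

```js
// Stale-While-Revalidate: answer from cache instantly, refresh the cache in the background.
self.addEventListener('fetch', (event) => {
  if (event.request.method !== 'GET') return;

  event.respondWith(
    caches.open(CACHE_NAME).then(async (cache) => {
      const cached = await cache.match(event.request);

      // Always kick off a background refresh so the cache stays current.
      const refresh = fetch(event.request)
        .then((response) => {
          if (response.ok) {
            cache.put(event.request, response.clone());
          }
          return response;
        })
        .catch(() => cached); // offline: fall back to whatever we already had

      // Serve the cached copy immediately if we have one; otherwise wait for the network.
      return cached || refresh;
    })
  );
});
```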
While libraries like `next-pwa` provide an incredible shortcut, taking the time to build a Service Worker from scratch demystifies the process and hands you the keys to building truly robust, offline-first web experiences. You now have the power to control every bit and byte of your app's caching behavior.
Happy coding, and may your apps never see the offline dinosaur again!