Mike Rogers ✈️

How to Make Rails Work Offline (PWA)

I've been experimenting a lot lately with allowing Ruby on Rails to work offline. By this I mean having a sensible fallback for when the network unexpectedly drops out (e.g. the user is underground on a train).

The main way to achieve this is by making our app a Progressive Web App (PWA) via a Service Worker. In the past I've always associated PWAs with Single Page Applications and very JavaScript-heavy codebases. However, with tools such as Webpacker, we can add a Service Worker while keeping a traditional Ruby on Rails approach (i.e. server-side rendering & Turbolinks).


I've put together a few screencasts so you can see everything in action.

What is a service worker?

A Service Worker is a JavaScript file you serve to the browser, which can intercept future network requests to your website. The result is you can control how a request behaves when the network is down, or choose to always serve certain requests from the cache.
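To make the interception idea concrete, here's a toy sketch (not the real browser API, and the names are my own): a service worker is essentially a list of predicates paired with handlers, and each incoming request is answered by the first handler whose predicate matches. Libraries like Workbox wrap exactly this pattern.

```javascript
// Toy sketch of service worker routing: each route pairs a predicate
// with a handler, and a request is answered by the first match.
// `routes` is an array of [test, handler] pairs; `fallback` handles the rest.
function createRouter(routes, fallback) {
  return (request) => {
    const match = routes.find(([test]) => test(request));
    return match ? match[1](request) : fallback(request);
  };
}

// Hypothetical usage: pages get one strategy, assets another.
const route = createRouter(
  [
    [(r) => r.destination === 'document', () => 'network-first'],
    [(r) => r.destination === 'style' || r.destination === 'script', () => 'cache-first'],
  ],
  () => 'pass-through'
);
```

In a real service worker, the equivalent dispatch happens inside `self.addEventListener('fetch', event => event.respondWith(...))`.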

Service Worker Limitations

When researching this topic, I found Service Workers do have some drawbacks you should be aware of:

  • The URL of your service worker must stay the same (e.g. /service-worker.js), so it can be tricky to get it working with the Asset Pipeline & Webpacker.
  • If you serve your service worker from a different port (i.e. via bin/webpacker-dev-server) it won't intercept HTTP requests as you'd expect.
  • The amount of data you can cache is pretty varied between browsers & devices. I'd recommend keeping your usage under 25MB.


Service Workers have been around for a few years, so there are quite a few libraries which make them a lot easier to work with. Here is a quick summary of the main ones to know about.

The serviceworker-rails Gem

The serviceworker-rails gem works pretty nicely for most use cases: it integrates with the Asset Pipeline (Sprockets) & has a very nifty generator for automated setup.

The only downside of this approach is that, because it uses the Asset Pipeline, it defaults to a verbose vanilla JavaScript approach. This makes it a little tricky to use the newer libraries which can cut down some of the boilerplate.

webpacker-pwa library

One of the biggest drawbacks of webpack is that it's quite tricky to configure if you're not working with it regularly. The webpacker-pwa library makes adding the extra configuration a lot easier.

The awesome result of this library is that you can write your service worker in modern JavaScript, and it'll be served from your /public directory from a file that doesn't have a content hash.
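For reference, the wiring is only a couple of lines in your webpack config. This is a sketch from memory of the library's README (treat the constructor arguments as an assumption and double-check them against webpacker-pwa's documentation):

```js
// config/webpack/environment.js — sketch from memory of webpacker-pwa's
// README; verify the exact call against the library before using.
const { environment, config } = require('@rails/webpacker');
const WebpackerPwa = require('webpacker-pwa');

// Registers extra entry points so service worker packs (e.g. those under
// app/javascript/service_workers/) are emitted without a content hash.
new WebpackerPwa(config, environment);

module.exports = environment;
```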


Workbox

The vanilla Service Worker JavaScript is pretty verbose. While I was initially exploring approaches to allowing Rails to work offline, I found the JavaScript was getting pretty hard to explain.

Then I was shown Workbox, which allows the Service Worker JavaScript to be boiled down to something more concise:

```js
// app/javascript/service_workers/service-worker.js
import { registerRoute } from 'workbox-routing';
import { NetworkFirst, StaleWhileRevalidate, CacheFirst } from 'workbox-strategies';
import { CacheableResponsePlugin } from 'workbox-cacheable-response';
import { ExpirationPlugin } from 'workbox-expiration';

// Loading pages (and Turbolinks requests) checks the network first
registerRoute(
  ({ request }) => request.destination === "document" || (
    request.destination === "" &&
    request.mode === "cors" &&
    request.headers.get('Turbolinks-Referrer') !== null
  ),
  new NetworkFirst({
    cacheName: 'documents',
    plugins: [
      new ExpirationPlugin({
        maxEntries: 5,
        maxAgeSeconds: 5 * 60, // 5 minutes
      }),
      new CacheableResponsePlugin({
        statuses: [0, 200],
      }),
    ],
  })
);

// Load CSS & JS from the Cache
registerRoute(
  ({ request }) => request.destination === "script" ||
    request.destination === "style",
  new CacheFirst({
    cacheName: 'assets-styles-and-scripts',
    plugins: [
      new ExpirationPlugin({
        maxEntries: 10,
        maxAgeSeconds: 60 * 60 * 24 * 30, // 30 Days
      }),
      new CacheableResponsePlugin({
        statuses: [0, 200],
      }),
    ],
  })
);
```

I think this JavaScript is very approachable compared to the library-free approach.


Caching Strategies

There are three main approaches for caching and serving content which I settled on using.


Network First

This is kind of the best default choice for any page which might change between page loads.

As the name hints, it'll try to request the resource from the webserver (caching it if it's successful), falling back to its cached copy if the server is unreachable.


Cache First

This is the best choice for assets such as CSS, JavaScript & images.

This approach will initially request the file, then cache the response. Subsequent requests are served from the cache.


Stale While Revalidate

This is the quirky option! It serves the cached content, but then in the background it'll make a request to the server to update its cache.
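To make the trade-offs between the three strategies concrete, here's a toy, hedged simulation in plain JavaScript. A `Map` stands in for the browser's Cache API and `network` stands in for `fetch()`; Workbox's real implementations handle much more (expiration, response validity, and so on).

```javascript
// Toy versions of the three strategies. `cache` is a Map and
// `network` is an async function standing in for fetch().

// Network First: prefer a live response, fall back to the cache.
async function networkFirst(url, cache, network) {
  try {
    const response = await network(url);
    cache.set(url, response); // cache every successful response
    return response;
  } catch (offline) {
    return cache.get(url); // undefined if we never cached it
  }
}

// Cache First: only hit the network when the cache is empty.
async function cacheFirst(url, cache, network) {
  if (cache.has(url)) return cache.get(url);
  const response = await network(url);
  cache.set(url, response);
  return response;
}

// Stale While Revalidate: answer from the cache straight away,
// while refreshing the cached copy in the background.
async function staleWhileRevalidate(url, cache, network) {
  const refresh = network(url)
    .then(response => { cache.set(url, response); })
    .catch(() => {}); // refresh failed; keep the stale copy
  if (cache.has(url)) return cache.get(url);
  await refresh; // nothing cached yet, so we have to wait
  return cache.get(url);
}
```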

Eager-Caching Assets

It's possible to preload assets into your cache. You can do this from within your service-worker.js, however I found that approach pushed me towards mixing ERB & JavaScript. Instead I eager-cached my assets by parsing the DOM when the service worker was registered:

```js
// app/javascript/service_workers/index.js
if ('serviceWorker' in navigator) {
  window.addEventListener('load', function() {
    navigator.serviceWorker.register('/service-worker.js', { scope: "/" })
      .then(function(registration) {
        console.log('[ServiceWorker Client]', 'registration successful with scope: ', registration.scope);

        registration.addEventListener('updatefound', function() {
          // Cache a few popular pages ahead of time.
          caches.open('documents').then(function(cache) {
            let links = document.querySelectorAll('a[href^="/"]:not([rel="nofollow"])');
            cache.addAll( Array.from(links).map(elem => elem.getAttribute("href")) );
            cache.addAll( [document.location.pathname] );
          });

          // Cache all the CSS & JS assets on the page.
          caches.open('assets-styles-and-scripts').then(function(cache) {
            let stylesheetLinks = document.querySelectorAll('link[rel="stylesheet"][href^="/"]');
            cache.addAll( Array.from(stylesheetLinks).map(elem => elem.getAttribute("href")) );

            let scriptLinks = document.querySelectorAll('script[src^="/"]');
            cache.addAll( Array.from(scriptLinks).map(elem => elem.getAttribute("src")) );
          });
        });
      }, function(err) {
        console.log('[ServiceWorker Client]','registration failed: ', err);
      });
  });
}
```

I didn't make a video on this approach as I wasn't able to validate anyone else doing it, but I did like it.


Conclusion

After I added a Service Worker to my Rails app, it was able to fall back to a read-only view when the network was down. This was pretty awesome! Especially as I didn't have to change my standard approach of Rails rendering the HTML & Turbolinks making things feel a bit snappier.

I think most apps could benefit from a Service Worker for the small performance win it can offer, plus I think having a read-only fallback for when your server is unreachable is a pretty cool trick.

One thing I didn't figure out is how to detect whether a response on the current page was served from the cache, i.e. so I could show the user a notification saying "Hey, you're offline".
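One adjacent trick worth mentioning: the browser exposes navigator.onLine plus `online`/`offline` events on window, which detect connectivity rather than per-response cache hits, so it's only a partial answer. A hedged sketch, written to take the window object as a parameter (the banner-toggling callback is a hypothetical placeholder):

```javascript
// Sketch: call `onChange(isOnline)` immediately and again whenever
// connectivity flips. `win` is expected to be the browser's window;
// the callback might show or hide a "you're offline" banner.
function bindConnectivityIndicator(win, onChange) {
  const update = () => onChange(win.navigator.onLine);
  win.addEventListener('online', update);
  win.addEventListener('offline', update);
  update(); // reflect the initial state straight away
}
```

Hypothetical usage: `bindConnectivityIndicator(window, online => { banner.hidden = online; });`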
