Rishi Raj Jain

Posted on • Updated on • Originally published at rishi.app

Case Study: How Nike.com can leverage Layer0 to improve its First Page Loads & Navigation by up to ~80%, acing Core Web Vitals.

With Layer0, nearly every website can boost its front-end performance. The easiest way to ace Largest Contentful Paint (part of Core Web Vitals)? Combine your front-end optimisations with the powerful caching and predictive prefetching offered by Layer0 by Limelight.

Introduction

A good user experience starts with delivering content as fast as possible. With the backbone of LLNW's CDN, Layer0 promises super-fast delivery, and with its predictive prefetching, super-fast navigation. In this study, I set out to improve nike.com by caching its home page, product listing pages & product display pages. I aimed for at least a 50% boost in performance, but voilà! I was able to pull off nearly an 80% improvement over a cup of tea (only code, haha).

Disclaimer

I'm a Solutions Engineer at Layer0, but this is purely my own ideation & work.

Leveraging Layer0 WebApp CDN

The starting point for shipping speed with Layer0 is their WebApp CDN guide. In short, Layer0 becomes the main CDN for the site/origin server (here, www.nike.com). Once a response is received from the origin, it gets cached on Layer0's globally distributed edge network. In the browser, those cached pages, assets, APIs, etc. can then be prefetched.


Show me the code!

I started by installing the Layer0 CLI with the following command:

npm i -g @layer0/cli # yarn global add @layer0/cli

Creating a new project with Layer0 is then just a matter of one command:

npx @layer0/cli@latest init

Irrespective of the initial configuration done by the CLI, I alter my project structure to look like the one below.

Project Structure

layer0.config.js: Controls how your app runs on Layer0. This is also where the origin/backend servers are defined.

— src/

browser.ts: Installs Layer0's prefetcher in the browser window.

cache.ts: Maintains the caching configuration constants.

routes.ts: Defines what shall be cached, and for how long, using the constants from cache.ts.

service-worker.ts: Once installed, starts prefetching content by predicting what the user is going to tap on next.

shoppingFlowRouteHandler.ts: Abstracts, in a single template, the process of fetching the upstream response from the origin/backend server.

transform.ts: Once the response is proxied, uses Layer0's transformResponse function to inject Layer0-specific JS that enables prefetching, and to make some front-end optimisations.

Now let's walk through each of the files mentioned above.

layer0.config.js

In this file, I've defined two backends: www.nike.com as my origin, and static.nike.com as my assets backend. The idea is to proxy everything, both assets and pages, through these two backends.

// project-name/layer0.config.js

module.exports = {
  routes: './src/routes.ts',
  connector: '@layer0/starter',
  backends: {
    // Proxying origin
    origin: {
      domainOrIp: 'www.nike.com',
      hostHeader: 'www.nike.com',
      disableCheckCert: process.env.DISABLE_CHECK_CERT || true,
    },
    // Proxying assets of origin
    assets: {
      domainOrIp: 'static.nike.com',
      hostHeader: 'static.nike.com',
      disableCheckCert: process.env.DISABLE_CHECK_CERT || true,
    },
  },
}

browser.ts

I use install from the @layer0/prefetch module to install the service worker. I set includeCacheMisses to true so that responses get prefetched even if they are not yet cached at the time of fetching.

// project-name/src/browser.ts

import { install } from '@layer0/prefetch/window'

// install layer0 service worker
document.addEventListener('DOMContentLoaded', function () {
  // @ts-ignore
  install({
    includeCacheMisses: true,
    // Don't want to wait for the cache to get warm :P
  })
})

cache.ts

I cache the pages at the edge, and in the browser only in the service worker, for an hour. forcePrivateCaching: true takes care of caching those pages where the upstream returns a response header of cache-control: private, no-cache. Similarly, I cache assets for a day at both the edge & in the service worker.

// project-name/src/cache.ts

const ONE_HOUR = 60 * 60
const ONE_DAY = 24 * ONE_HOUR

// The default cache setting for pages in the shopping flow
export const CACHE_PAGES = {
  edge: {
    maxAgeSeconds: ONE_HOUR,
    forcePrivateCaching: true,
  },
  browser: {
    maxAgeSeconds: 0,
    serviceWorkerSeconds: ONE_HOUR,
  },
}

// The default cache setting for static assets like JS, CSS, and images.
export const CACHE_ASSETS = {
  edge: {
    maxAgeSeconds: ONE_DAY,
    forcePrivateCaching: true,
  },
  browser: {
    maxAgeSeconds: 0,
    serviceWorkerSeconds: ONE_DAY,
  },
}

shoppingFlowRouteHandler.ts

Assume a user comes looking for the homepage. According to the configuration in routes.ts, the RouteHandler defined in this file gets called, and it fetches the upstream response for the same route. Then, the set-cookie & content-security-policy response headers are removed. Finally, the response is transformed by the transformResponse function defined in transform.ts.

// project-name/src/shoppingFlowRouteHandler.ts

import { CACHE_PAGES } from './cache'
import transformResponse from './transform'
import { RouteHandler } from '@layer0/core/router/Router'

const handler: RouteHandler = async ({ cache, removeUpstreamResponseHeader, updateResponseHeader, setResponseHeader, proxy }) => {
  cache(CACHE_PAGES)
  removeUpstreamResponseHeader('set-cookie')
  removeUpstreamResponseHeader('cache-control')
  removeUpstreamResponseHeader('content-security-policy-report-only')
  removeUpstreamResponseHeader('content-security-policy')
  setResponseHeader('cache-control', 'public, max-age=86400')
  updateResponseHeader('location', /https:\/\/www\.nike\.com\//gi, '/')
  proxy('origin', { transformResponse })
}

export default handler
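The updateResponseHeader call above rewrites the location header so that origin redirects keep visitors on the Layer0-served site instead of bouncing them back to www.nike.com. Its effect can be illustrated with a standalone sketch (rewriteLocation is a hypothetical helper for illustration, not part of Layer0's API):

```typescript
// Sketch of the effect of updateResponseHeader('location', ...):
// absolute redirects to www.nike.com become relative paths, so the
// visitor stays on the Layer0-served domain.
// `rewriteLocation` is a hypothetical helper, for illustration only.
function rewriteLocation(location: string): string {
  return location.replace(/https:\/\/www\.nike\.com\//gi, '/')
}
```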

routes.ts

I define all the paths to be cached (with the help of the configurations defined in cache.ts); the rest are sent to the origin (the fallback).

An example of defining a route

Assume that on a website, an asset with the relative URL /l0-prodstatic/images/image/1.png is being fetched. The route below binds the variable :path* to images/image/1.png. Layer0 then fetches :path* relative to the origin server defined under the assets key of backends in layer0.config.js. Along the way, Layer0 removes the set-cookie header, updates the cache-control response header, and applies the cache timings.

.match('/l0-prodstatic/:path*', ({ cache, removeUpstreamResponseHeader, proxy, setResponseHeader }) => {
  setResponseHeader('cache-control', 'public, max-age=86400')
  removeUpstreamResponseHeader('set-cookie')
  cache(CACHE_ASSETS)
  proxy('assets', { path: ':path*' })
})
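To make the :path* binding concrete, here's a minimal standalone sketch of the mapping this route performs (extractAssetPath and upstreamAssetUrl are hypothetical helpers; Layer0's router does this matching internally):

```typescript
// Hypothetical helpers (not Layer0 APIs) illustrating the :path* binding:
// for /l0-prodstatic/images/image/1.png, :path* binds to images/image/1.png.
function extractAssetPath(url: string): string | undefined {
  const match = url.match(/^\/l0-prodstatic\/(.+)$/)
  return match ? match[1] : undefined
}

// The extracted path is then fetched from the `assets` backend,
// i.e. https://static.nike.com/<path>.
function upstreamAssetUrl(url: string): string | undefined {
  const path = extractAssetPath(url)
  return path ? `https://static.nike.com/${path}` : undefined
}
```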

The whole file

// project-name/src/routes.ts

import { CACHE_ASSETS } from './cache'
import { Router } from '@layer0/core/router'
import shoppingFlowRouteHandler from './shoppingFlowRouteHandler'

export default new Router()
  // L0 Service Worker
  .match('/service-worker.js', ({ cache, removeUpstreamResponseHeader, serveStatic, setResponseHeader }) => {
    setResponseHeader('cache-control', 'public, max-age=86400')
    removeUpstreamResponseHeader('set-cookie')
    cache(CACHE_ASSETS)
    serveStatic('dist/service-worker.js')
  })
  // L0 Browser.js
  .match('/__layer0__/:browser/browser.js', ({ cache, removeUpstreamResponseHeader, serveStatic, setResponseHeader }) => {
    setResponseHeader('cache-control', 'public, max-age=86400')
    removeUpstreamResponseHeader('set-cookie')
    cache(CACHE_ASSETS)
    serveStatic('dist/browser.js')
  })
  // Homepage
  .match('/', shoppingFlowRouteHandler)
  .match('/:locale', shoppingFlowRouteHandler)
  // PLP
  .match('/w/mens-shoes:path', shoppingFlowRouteHandler)
  .match('/:locale/w/mens-shoes:path', shoppingFlowRouteHandler)
  // PDP
  .match('/t/air-zoom:path/:suffix*', shoppingFlowRouteHandler)
  .match('/:locale/t/air-zoom:path/:suffix*', shoppingFlowRouteHandler)
  // Assets
  .match('/static/:path*', ({ cache, removeUpstreamResponseHeader, proxy, setResponseHeader }) => {
    setResponseHeader('cache-control', 'public, max-age=86400')
    removeUpstreamResponseHeader('set-cookie')
    cache(CACHE_ASSETS)
    proxy('origin')
  })
  .match('/assets/:path*', ({ cache, removeUpstreamResponseHeader, proxy, setResponseHeader }) => {
    setResponseHeader('cache-control', 'public, max-age=86400')
    removeUpstreamResponseHeader('set-cookie')
    cache(CACHE_ASSETS)
    proxy('origin')
  })
  // Assets from static.nike.com, served from /l0-prodstatic as rewritten in transform.ts
  .match('/l0-prodstatic/:path*', ({ cache, removeUpstreamResponseHeader, proxy, setResponseHeader }) => {
    setResponseHeader('cache-control', 'public, max-age=86400')
    removeUpstreamResponseHeader('set-cookie')
    cache(CACHE_ASSETS)
    proxy('assets', { path: ':path*' })
  })
  // If not found at any of above, but is an asset, cache it.
  .match(
    '/:path*/:file.:ext(js|mjs|css|png|ico|svg|jpg|jpeg|gif|ttf|woff|otf)',
    ({ cache, removeUpstreamResponseHeader, proxy, setResponseHeader }) => {
      setResponseHeader('cache-control', 'public, max-age=86400')
      removeUpstreamResponseHeader('set-cookie')
      cache(CACHE_ASSETS)
      proxy('origin')
    }
  )
  // Everything else to origin
  .fallback(({ proxy }) => {
    proxy('origin')
  })

service-worker.ts

The service worker is where the prefetching intelligence lives. Let's look at this specific snippet from the whole file:

An example of defining what to be prefetched

Assume that I'm on the home page, and a link to one of my defined product listing pages is being prefetched. Apart from prefetching the HTML of that page, the service worker in the background will read what's in the page and identify the HTML elements that have an attribute l0 set to true. Once identified, the callback function consumes the href attribute and starts prefetching those elements (whether CSS, JS, image, or HTML), storing them in the browser for future calls. Now you might be wondering, how do we inject that attribute? That's addressed in transform.ts, time to scroll through.


new Prefetcher({
  plugins: [
    new DeepFetchPlugin([
      {
        selector: '[l0="true"]',
        maxMatches: 3,
        attribute: 'href',
        as: 'image',
        callback: deepFetchImage,
      },
    ]),
  ],
})
  .route()
  .cache(/^https:\/\/static\.nike\.com\/.*/)

function deepFetchImage({ $el, el, $ }: DeepFetchCallbackParam) {
  const urlTemplate = $(el).attr('href')
  console.log($(el), urlTemplate)
  if (urlTemplate) {
    console.log(`\n[][][][]\nPrefetching PDP: ${urlTemplate}\n[][][][]\n`)
    prefetch(urlTemplate, 'image')
  }
}
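Conceptually, the [l0="true"] deep-fetch rule scans the prefetched page's HTML for elements carrying that attribute and collects up to maxMatches href values to prefetch. A simplified, regex-based sketch of that selection (the real DeepFetchPlugin parses the HTML properly; collectL0Hrefs is a hypothetical helper for illustration only):

```typescript
// Simplified sketch of the [l0="true"] selection: collect the href of
// each element marked l0="true", up to maxMatches. The real
// DeepFetchPlugin uses a proper HTML parser; a regex is used here
// only to illustrate the idea.
function collectL0Hrefs(html: string, maxMatches = 3): string[] {
  const hrefs: string[] = []
  const tagRe = /<[^>]*\bl0="true"[^>]*>/g
  for (const tag of html.match(tagRe) ?? []) {
    const hrefMatch = tag.match(/\bhref="([^"]+)"/)
    if (hrefMatch) hrefs.push(hrefMatch[1])
    if (hrefs.length >= maxMatches) break
  }
  return hrefs
}
```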

The whole file

// project-name/src/service-worker.ts

import { skipWaiting, clientsClaim } from 'workbox-core'
import { Prefetcher, prefetch } from '@layer0/prefetch/sw'
import DeepFetchPlugin, { DeepFetchCallbackParam } from '@layer0/prefetch/sw/DeepFetchPlugin'

skipWaiting()
clientsClaim()

new Prefetcher({
  plugins: [
    new DeepFetchPlugin([
      {
        selector: 'script',
        maxMatches: 10,
        attribute: 'src',
        as: 'script',
        callback: deepFetchJS,
      },
      {
        selector: '[rel="stylesheet"]',
        maxMatches: 10,
        attribute: 'href',
        as: 'style',
        callback: deepFetchLinks,
      },
      {
        selector: '[rel="preload"]',
        maxMatches: 10,
        attribute: 'href',
        as: 'style',
        callback: deepFetchLinks,
      },
      {
        selector: '[l0="true"]',
        maxMatches: 3,
        attribute: 'href',
        as: 'image',
        callback: deepFetchImage,
      },
    ]),
  ],
})
  .route()
  // Cache assets from static.nike.com once in the browser
  .cache(/^https:\/\/static\.nike\.com\/.*/)

function deepFetchImage({ $el, el, $ }: DeepFetchCallbackParam) {
  const urlTemplate = $(el).attr('href')
  console.log($(el), urlTemplate)
  if (urlTemplate) {
    console.log(`\n[][][][]\nPrefetching PDP: ${urlTemplate}\n[][][][]\n`)
    prefetch(urlTemplate, 'image')
  }
}

function deepFetchLinks({ $el, el, $ }: DeepFetchCallbackParam) {
  const urlTemplate = $(el).attr('href')
  console.log($(el), urlTemplate)
  if (urlTemplate) {
    console.log(`\n[][][][]\nPrefetching Links: ${urlTemplate}\n[][][][]\n`)
    prefetch(urlTemplate, 'style')
  }
}

function deepFetchJS({ $el, el, $ }: DeepFetchCallbackParam) {
  const urlTemplate = $(el).attr('src')
  console.log($(el), urlTemplate)
  if (urlTemplate) {
    console.log(`\n[][][][]\nPrefetching JS: ${urlTemplate}\n[][][][]\n`)
    prefetch(urlTemplate, 'script')
  }
}

transform.ts

The transformResponse function, called by shoppingFlowRouteHandler.ts, is used to modify the HTML response before sending it to users. The injectBrowserScript function injects a reference to the compiled browser.ts. If there's a response body, I assume it contains valid HTML and parse it into a cheerio object. Then comes the front-end optimisation on the fly. Think of the approach as serving pages optimised for performance, and then hydrating the page as required with JS. I apply lazy loading to every image on the page. Then I detect whether the page is a PLP or a PDP. In either case, I select the element that is critical to LCP, add a preload for it, and remove lazy loading for that particular element.

// project-name/src/transform.ts

import cheerio from 'cheerio'
import Request from '@layer0/core/router/Request'
import Response from '@layer0/core/router/Response'
import { injectBrowserScript } from '@layer0/starter'

export default async function transformResponse(response: Response, request: Request) {
  // inject browser.ts into the document returned from the origin
  injectBrowserScript(response)

  if (response.body) {
    let $ = cheerio.load(response.body)

    console.log(`Transform script running on ${request.url}`)

    // For production this script should be included in original website base code.
    // <script defer src="/__layer0__/devtools/install.js"></script>

    $('head').append(`
      <script defer src="/__layer0__/cache-manifest.js"></script>
    `)

    // Load every other image lazily to avoid unnecessary initial loads on the page
    $('img').each((i, el) => {
      $(el).attr('loading', 'lazy')
    })

    // First image on PLP to load as soon as possible, preload for faster first load
    if (request.path.includes('/w/')) {
      $('.product-card__body noscript').each((i, el) => {
        if (i < 1) {
          let ele = $(el)
          let hml = $(el).html()
          if (ele && hml) {
            let img = cheerio.load(hml)
            $('.product-card__body img').first().removeAttr('loading')
            $('.product-card__body img').first().attr('src', img('img').attr('src'))
            // Preload the image, and add an attribute to enable easy prefetch
            $('head').prepend(`<link l0="true" rel="preload" as="image" href="${img('img').attr('src')}" />`)
          }
        }
      })
    }

    // First image on PDP to load as soon as possible, preload for faster first load
    if (request.path.includes('/t/')) {
      let img = ''
      $('img.u-full-height').each((i, el) => {
        if (i == 1) {
          img = $(el).attr('src') || ''
          // Preload the image, and add an attribute to enable easy prefetch
          $('head').prepend(`<link l0="true" rel="preload" as="image" href="${img}" />`)
        }
      })
      $('img.u-full-height').each((i, el) => {
        if (i == 0) {
          $(el).removeAttr('loading')
          $(el).removeAttr('data-fade-in')
          $(el).attr('src', img)
        }
      })
    }

    response.body = $.html()
      // Replace { display: none; } with {}
      .replace(/\{ display\: none\; \}/g, '{}')
      // Remove opacity: 0;
      .replace(/opacity\: 0\;/g, '')
      // Turn protocol-relative ="// URLs into ="https://
      .replace(/\=\"\/\//g, '="https://')
      // Replace all https://www.nike.com/ with /
      .replace(/https:\/\/www\.nike\.com\//g, '/')
      // Replace all https://static.nike.com/ with /l0-prodstatic/
      .replace(/https:\/\/static\.nike\.com\//g, '/l0-prodstatic/')
      // Strip the ?layer0_dt_pf=1 query param
      .replace(/\?layer0\_dt\_pf\=1/g, '')
  }
}
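To make the chain of replacements above easier to follow, here's the same set of rewrites as a standalone function (rewriteBody is a hypothetical helper; in the actual handler the chain runs directly on response.body):

```typescript
// The same URL/CSS rewrites transform.ts applies to the response body,
// extracted into a hypothetical standalone helper for illustration.
function rewriteBody(html: string): string {
  return html
    .replace(/\{ display\: none\; \}/g, '{}')
    .replace(/opacity\: 0\;/g, '')
    .replace(/\=\"\/\//g, '="https://')
    .replace(/https:\/\/www\.nike\.com\//g, '/')
    .replace(/https:\/\/static\.nike\.com\//g, '/l0-prodstatic/')
    .replace(/\?layer0\_dt\_pf\=1/g, '')
}
```

Note that the protocol-relative ="// rewrite runs before the static.nike.com rewrite, so //static.nike.com/... URLs also end up under /l0-prodstatic/.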

We’re done! Let’s deploy fearlessly now.

Emulating the production experience, locally

With Layer0's CLI, it is possible to emulate edge rules locally, as if the code had gone live. Here's how I did it:

layer0 build && layer0 run --production # 0 build && 0 run -p

Deploy

Deploying from the CLI can be done as mentioned in the Layer0 docs:

layer0 deploy # 0 deploy

Results

If you've come this far (awesome!), you probably want to know the results. Tbh, I was surprised by the results too. Up to 73% improvement on first page loads & 80% improvement on navigation!

PLP First Load, Improvement ~73%

(PLP First Load, LCP: [Nike.com: 4.8s](https://webpagetest.org/result/220206_AiDcZN_CY3/), Optimised [with Layer0: 1.3s](https://webpagetest.org/result/220206_AiDcQ6_CY6/), Improvement: ~73%)


PDP First Load, Improvement ~58%

(PDP First Load, LCP: [Nike.com: 4.5s](https://webpagetest.org/result/220206_AiDcS7_CYC/), Optimised [with Layer0: 1.9s](https://webpagetest.org/result/220206_AiDcXR_CYE/), Improvement: ~58%)


Home to PLP Navigation, Improvement ~80%

(Home to PLP Navigation, LCP: [Nike.com: 5.4s](https://webpagetest.org/result/220206_BiDc16_C3M/), Optimised [with Layer0: 1.1s](https://webpagetest.org/result/220206_BiDcRC_C3J/), Improvement: ~80%)


PLP to PDP Navigation, Improvement ~60%

(PLP to PDP Navigation, LCP: [Nike.com: 2.7s](https://webpagetest.org/result/220206_BiDcB8_C46/), Optimised [with Layer0: 1.1s](https://webpagetest.org/result/220206_BiDc21_C43/), Improvement: ~60%)


Video: Home to PLP Navigation Comparison


(Home to PLP Navigation Experience Comparison, recorded via WebPageTest: 5.4s on Nike.com vs 1.1s on the Nike.com demo on Layer0)

Discussion

My implementation aggressively focuses on optimising what I set out to test: first page loads & LCP. I do that with Layer0's caching of more than just assets, and with Layer0's prefetching, which serves users as if the whole website were running in their browser itself. I also made some front-end optimisations to prioritise the critical assets and the page load. This approach is definitely not set in stone, but it seems nike.com could deliver a much better online shopping experience by utilising caching and prefetching.

Code

https://github.com/rishi-raj-jain/improve.nike.com
