DEV Community

Jamund Ferguson

DIY Pre-Loading for Faster Data Fetching

If you're struggling with slow page load times in your React app, I want to show you a technique that can shave off hundreds of milliseconds.

Here's how you can pre-load your data in the most optimal way without relying on any external dependencies.

First, the problem

We can break down the problem into three parts:

  1. React applications often end up in large JavaScript bundles
  2. Many components rely on external data
  3. Data fetching won't usually start until your JS bundle finishes downloading and your components finish rendering

Here's a simple chart to help you visualize the problem. Look at how much happens before the data starts downloading.

A timeline of the critical path to rendering: HTML, then JS download, then JS processing, and only then does the data start to load; after the data renders, the page is ready.

What we noticed in our app at Amazon was that components would fire off data fetching in a useEffect while deeply nested in the component tree. That meant the fetch wouldn't start until 50-250ms after our ReactDOM.render() call. On top of that, our huge JavaScript bundle took an additional 350ms (or more) to download and execute. Put together, these delays added up to a huge opportunity for improvement.

Measuring the Problem

The Chrome Web Inspector provides a number of tools that should make it easy to figure out if you're affected by this problem.

Check the Network Tab

First, find your main fetch call in the Network tab. Then open the Timing section and look for "Started at". This shows how long after the page started loading the request was actually sent. You want this number to be as low as possible.

Reading the timing section of the network tab

Dive into the Performance Timeline

Now run your app in the web performance inspector. Look at it carefully and see if you can recognize the problem:
Chrome Performance Inspector

What you want to look for is your main app file and your main data fetch call. Here our app is bundled in a file called vendor.f2843ed7.js and we're fetching data from /api/slow/data. In this contrived example it takes around 200ms between the time vendor.js starts downloading and the time our fetch call begins.

Time wasted before fetching critical data

The chart above highlights two specific blocks of time that we can mitigate to optimize performance of our data loading.

The Solution

The solution we came up with could be broken up into two parts:

  1. Kick off data fetching as early in our script as possible (i.e., remove it from the React component lifecycle)
  2. Parallelize data fetching with loading our JavaScript bundle

To accomplish the first of these we need some kind of global store. It doesn't need to be anything fancy. In our case we were already using Redux, and a Redux store can receive dispatched actions from outside the React tree, as I'll demonstrate below.
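The key property we rely on is that a store is just an object with a dispatch method, so plain module code can update it before React ever mounts. Here's a minimal sketch of that idea; the tiny hand-rolled store below is an illustrative stand-in for a real Redux store, and the reducer, action, and loadData names are hypothetical:

```javascript
// A minimal stand-in for a Redux store (the real app would use redux's
// createStore/configureStore; the dispatch/getState surface is the same).
function createMiniStore(reducer, initialState) {
  let state = initialState;
  const listeners = [];
  return {
    getState: () => state,
    dispatch: (action) => {
      state = reducer(state, action);
      listeners.forEach((l) => l());
      return action;
    },
    subscribe: (l) => listeners.push(l),
  };
}

// Hypothetical action creator and reducer for the fetched page data.
const dataLoaded = (data) => ({ type: "DATA_LOADED", payload: data });
const reducer = (state = { data: null }, action) =>
  action.type === "DATA_LOADED" ? { ...state, data: action.payload } : state;

const store = createMiniStore(reducer, { data: null });

// The key idea: dispatch from plain module code, outside any React component.
// loadData here is a stand-in for the app's real fetch call.
const loadData = () => Promise.resolve({ title: "hello" });
loadData().then((data) => store.dispatch(dataLoaded(data)));
```

Because the store is created in a plain module, both this early dispatch and the later React tree (via Provider) can share the same instance.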

In their simplest form, most of our network-dependent components looked something like this:

// a simplified data loading example
function PageLevelComponent() {
   const dispatch = useDispatch();
   useEffect(() => {
       loadData().then((data) => dispatch(dataLoaded(data)));
   }, []);
   // ... render using data from the store
}

We ended up moving this loadData() call into our root app file, the same one that kicks off rendering the React component tree. You'll notice we're still relying on Redux to store the data, but we reference the store directly for our dispatch instead of getting it from context or hooks.

import { store } from "./store";

// start loading data immediately and dispatch it to the redux store
loadData(location.pathname).then((data) => store.dispatch(dataLoaded(data)));

// render the application with the same redux store
ReactDOM.render(<Provider store={store}><App /></Provider>, rootEl);

After making that change, you'll see that the data starts downloading shortly after the JS starts executing. There's no longer a large delay.

Data starts loading early in app execution

With this in place we asked ourselves if we could take it even further. The time to load our large JS bundle was clearly limiting how soon we were able to fetch our data. No matter how early we fired off the network request, the bundle still had to be downloaded and parsed before it was executed. Would it be possible to load our data in parallel with our JS somehow?

The Optimization

Taking it to the next level required several careful steps to execute properly. First, we had to create a new entry in our webpack config, which we called preload.js. That preload.js needed to be as small as possible: ideally no Redux, no Axios, etc.

entry: {
    "main": "./index.js",
    "preload": "./preload.js"
}

At the time we were still supporting IE11, which meant we would likely need a Promise polyfill, a fetch polyfill of some kind, and a URLSearchParams polyfill. In our case we were using Axios, and we ran into trouble when we didn't include it in both bundles, because of slightly different error handling and promise implementations. All of that ended up bumping our preload file to around 11kb minified.
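The polyfill entry can be a one-import-per-feature build fragment. Here's a sketch of what such a file might look like; the specific packages (core-js and whatwg-fetch) are assumptions for illustration, not necessarily what we used:

```javascript
// polyfills.js -- only what the preload script itself needs for IE11.
// Package choices are illustrative; any Promise/fetch/URLSearchParams
// polyfills would do.
import "core-js/features/promise";
import "core-js/features/url-search-params";
import "whatwg-fetch"; // fetch polyfill
```

Keeping this file limited to what preload.js actually touches is what keeps the preload bundle small.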

The contents of preload.js looked something like this:

import "./polyfills.js";
import { loadData } from "./xhr.js";

// kick off the promise and cache it in a global variable
window.__DATA_LOADER_CACHE__ = loadData(location.pathname);

Then, later in our main bundle, we check for the presence of that global variable and, if it exists, use it instead of calling our loadData() method again.

(window.__DATA_LOADER_CACHE__ || loadData(location.pathname)).then((data) => store.dispatch(dataLoaded(data)));
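That cache-or-fetch fallback can be factored into a small helper so the main bundle stays agnostic about whether preload.js ran at all. A sketch, with illustrative names (getInitialData is hypothetical, and a plain object stands in for window):

```javascript
// Use the preloaded promise if preload.js ran; otherwise start the fetch now.
// globalObj stands in for `window`; loadData is the app's real fetch function.
function getInitialData(globalObj, loadData, path) {
  return globalObj.__DATA_LOADER_CACHE__ || loadData(path);
}

// Example: the preload script already kicked off the request.
const fakeWindow = { __DATA_LOADER_CACHE__: Promise.resolve({ from: "preload" }) };
const loadData = (path) => Promise.resolve({ from: "fallback", path });

getInitialData(fakeWindow, loadData, "/api/slow/data").then((data) => {
  console.log(data.from); // logs "preload": the cached promise wins
});
```

Because both branches return a promise, the caller's .then((data) => store.dispatch(...)) works identically either way.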

We built the preload script in such a way that it was completely optional: if it didn't run at all, the normal app could continue running properly. But that approach did result in a handful of modules being duplicated. With a little more care we probably could have gotten the script down to around 2kb. Despite it not being perfectly tiny, the results were tremendous:

Time savings

Your data becomes available to your application as soon as it's needed. And even if your data call is still outstanding when the app is ready, your app will re-render as soon as the data finishes downloading. It's a much better user experience, and the only trade-off is a tiny bit of awkward code.

How'd it turn out?

🏆 In the app we applied this to at Amazon our 90th percentile Time to Interactive went down by over 350ms. A huge savings for very little effort. I definitely recommend you figure out how to pre-load data in your application as well.


  1. Check out Ryan Florence's When to Fetch talk for a more elegant solution for faster data loading
  2. We ended up making a cache keyed on URL and query params and stuck that in the global variable, along with other data like any errors, etc.
  3. It's important to log if you end up fetching the data URL twice, which can happen if you incorrectly duplicate your URL parsing logic 😬
  4. I tried to reproduce this in vite but couldn't quite figure out how to split out the preload file. When I figure it out I'll post a demo of all 3 states.
  5. Can't we just use <link rel="preload" as="fetch"> and call it good? I mean yes, try that way first! We couldn't get it working consistently, but that was a few years back and things seem better now.
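The URL-and-query-param cache from note 2 hinges on both preload.js and the main bundle deriving the same key, which also sidesteps the duplicated-URL-parsing pitfall from note 3. One way to do that is to share a single key function between the two bundles; a sketch, where the key format is an assumption:

```javascript
// Build a stable cache key from a path and query string.
// Sorting the params means "?b=2&a=1" and "?a=1&b=2" hit the same entry.
function cacheKey(pathname, search) {
  const params = new URLSearchParams(search);
  params.sort();
  const qs = params.toString();
  return qs ? `${pathname}?${qs}` : pathname;
}

console.log(cacheKey("/api/slow/data", "?b=2&a=1")); // "/api/slow/data?a=1&b=2"
```

Importing this one function into both entries guarantees the preloaded response is found under the same key the main bundle looks up.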
