
How we decreased load time by 40% by removing just 1 file

Mladen Stojanovic ・ 3 min read

So the title of this article is really clickbait-ish, but it is exactly what happened.

I started working on the project around September last year. Because of the NDA signed with the client, I can't write much about names, but it is a big company in its field.
They use a React SSR app to serve their websites, which are not a source of income; the sites exist for information: blog posts, articles, etc. Of course they run ads on them, and the sites get decent traffic, but the company didn't pay much attention to them.

I know what you're going to ask now: how come they don't pay much attention to it, yet they use a really modern stack for their "not so important" website?

Well, they had asked the previous team to build it with cutting-edge tech, since they were redesigning and recreating everything. As you might guess, adopting cutting-edge technologies early in their lifecycle leaves you with (some) negative outcomes, but more on that in other posts (hopefully).

Back to the present: my team and I came in as part of a team that would maintain and add features to this (and several other) projects for this client. From day one it really annoyed me how slow the app was. It is a server-side rendered React app; it should be lightning fast! New Relic stats came in every week and showed an average load time of 10 or 11 seconds. It was really crazy!

A couple of months in, I was at a tech conference where I attended a talk and workshop by Harry Roberts (https://twitter.com/csswizardry), a performance consultant. The biggest point of his whole presentation really inspired me:

Your website's performance will increase as soon as you start paying attention to it

Or something like that; it was a long time ago :)

But from that point I actually started paying attention to this app's performance.
In between standard issues and sprints, I started to analyze what the biggest problems in our app were, and I very quickly realized that our app's CSS bundle was 2.9MB unzipped and 1.9MB gzipped.
I was shocked. How does a CSS file reach that kind of size? It is a big app, but not THAT big. And gzip loves repetition, so how could this not be compressed any further?

Time passed with new tasks and strict deadlines (we've all been there), but I just couldn't get that CSS file out of my mind. One day my teammates and I sat down and started looking for the problem. We sorted every CSS file in the project by size and found one that was 1.5MB (!!!).
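That kind of audit is worth automating. Here is a minimal sketch of sorting a project's stylesheets by size; it builds a throwaway directory so the example can run anywhere, but in practice you would point `root` at your own source tree or build output:

```python
import tempfile
from pathlib import Path

# Throwaway tree so the example is self-contained; use your real root instead.
root = Path(tempfile.mkdtemp())
(root / "reset.css").write_text("body { margin: 0; }")
(root / "bloated.css").write_text("/* filler */ " * 100_000)

# Biggest stylesheets first -- a 1.5MB offender floats straight to the top.
by_size = sorted(root.rglob("*.css"), key=lambda p: p.stat().st_size, reverse=True)
for p in by_size:
    print(f"{p.stat().st_size / 1024:8.1f} KB  {p.name}")
```

A recursive glob plus a sort is often all it takes to surface this class of problem.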

Someone had taken 8 decently sized images (around 1500x600px), converted them to base64, and inlined them in a CSS file that was shipped in the bundle on every load, even when it wasn't used!
Deleting that file reduced our bundled CSS to 1.3MB unzipped, or 700KB gzipped. A huge win!
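This also explains the gzip question from earlier: compressed image bytes look essentially random, base64 inflates them by a third, and gzip can squeeze the base64 text back down, but not below the size of the original binary. A quick sketch, with random bytes standing in for already-compressed image data:

```python
import base64
import gzip
import os

raw = os.urandom(100_000)    # stands in for already-compressed image bytes
b64 = base64.b64encode(raw)  # what gets inlined into a CSS data: URI

# base64 uses 4 output bytes per 3 input bytes: a fixed ~33% penalty.
print(f"base64 overhead: {len(b64) / len(raw):.2f}x")

# gzip recovers most of the encoding overhead, but the result still
# cannot get smaller than the (incompressible) raw bytes.
print(f"gzipped base64:  {len(gzip.compress(b64)) / len(raw):.2f}x")
```

So inlining large images never comes for free, even behind gzip; data URIs only make sense for tiny assets.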

Later on I tested the website with several performance tools, and every single one showed a 40 to 60% decrease in load times (first paint, full page load, etc.)!

Moral of the story:

Pay attention to your app's performance; sometimes really small changes can bring really rewarding results!

Cheers!


Discussion

 

I recommend running your website through Google's speed test; it will reveal a lot of things.
Issues like the one you mentioned will be discovered immediately, with no work.

developers.google.com/speed/

 

Have you tried webpagetest.org/? I think it is way better than Google's speed test; it gives a lot more stats!

 

The more tests the better! DevTools has great auditing tools as well!

 

As soon as you said "8 decently sized images", my immediate thought was "please, no. Not base64-encoded images in CSS..."

I have to wonder why someone would do that -- at those dimensions, I'm guessing they're photos?

 

I have no idea. It was 8 photos (4 different photos in 2 sizes) for a 500 page, which kinda makes sense, BUT they were far too big to be base64 encoded, and there was no chunking or lazy loading of any sort, so they were loaded every time anyone visited the site. Crazy!

 

Agreed. There needs to be a reasonable size limit and purpose decided on for doing that sort of thing.

 

Love it!

I found something similar: we were shipping megs of data in our SSR'd JSON payloads. I curled the endpoint, used sed to strip the response down to the JSON, ran a small JS script to unminify it, and then started exploring with jq. After a little bit, I realized that 90% of the payload was duplicated at another location in the payload. It turned out we were using Redux, and some top-level component received the data, hung onto it, and passed it to one of its children. In memory this isn't an issue, because both point to the same spot in memory. But JSON doesn't have a feature for referencing values from other spots in the structure (interesting note: YAML does). So when we serialized the entire state, that data got written out twice. It turned out the parent didn't need it, so we removed it, and the initial payload was cut almost in half.
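The duplication described here is easy to reproduce: two references to the same object in memory serialize as two full copies in JSON. A small illustration (the state shape and key names are invented, not the actual app's):

```python
import json

products = [{"id": i, "name": f"item-{i}"} for i in range(100)]

# Parent and child slices both reference the same list in memory...
state = {
    "page":    {"products": products},  # parent held on to the data
    "catalog": {"products": products},  # child actually uses it
}

# ...but JSON has no notion of shared references, so it is written out twice.
payload = json.dumps(state)

# Dropping the parent's copy nearly halves the serialized size.
slim = json.dumps({"catalog": {"products": products}})
print(len(payload), len(slim))
```

The in-memory aliasing is invisible until serialization, which is exactly why this class of bug tends to show up only in payload sizes.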

 

Yikes, what a weird problem :)

But that feeling when you delete it and it starts working so fast... Priceless :D