This was originally posted on my own blog a while ago. I recently joined dev.to after lurking for a while and thought I'd contribute something.
I visited a website recently that "weighed" 22MB. Twenty-two megabytes, gzipped.
Over recent years there's been an increase in average page weight, weight in this context being the overall size of all the images and code that make up a website. If none of that means anything to you, then let me just say that 22MB for a website is ridiculous. The pages of our travel blog, theransleystravel are filled with images from our adventures and the pages still come in at around 10MB (which I'll admit is more than they really should be).
But visit a selection of websites, whether news sites, blogs, ecommerce websites or any other type and you'll probably find that most are way bigger than they should be – 5MB to 10MB pages aren't uncommon.
This weight gain causes problems. It usually goes unnoticed by designers, developers and business owners, who view the site on a desktop or high-powered laptop over fast broadband. In fact, the site I've decided to pick on loaded acceptably quickly on the high-speed fibre connection at my office.
"But most of my users are in Western countries," you say. "They have fast connections, so this doesn't affect me, and anyway my site isn't that big." The problems come when your users aren't on super-fast broadband. They come when users are on the train, speeding through patchy 4G or, even worse, 3G (God forbid, EDGE). They come when your user is in Thailand, trying to load your website on a travel SIM with precious little data. Or for the many American users on metered connections. Or when your app or website serves users permanently living with slow internet speeds.
Those 22MB websites add up.
And the vast majority of the time, there's no need for a website to be that big. It just happens by accident. It happens, largely, as a consequence of platforms like WordPress, Wix and other site builders. It happens when the people who edit content through their CMS or site builder aren't aware that they shouldn't upload full-resolution photos straight out of a 50MP camera. Of course, all being well, the site would be set up to scale and optimise these images as necessary, so images can be uploaded at whatever size and delivered to users at the correct size. Too often, though, poorly set-up sites and this lack of knowledge combine to leave us with pages that load several ~4MB images in a hero slider when, as a rough guide, the entire page combined should really be under 4MB.
Recently I rebuilt and launched a new site for Fishpie, where I started working in December. Their old site was built using WordPress, which in itself isn't a bad thing. Let's get this out of the way first: I think WordPress is a tool like any other. It can be used well to build great sites, and it can be used poorly to build bloated, slow sites, just like any other tool. Unfortunately, for a variety of reasons, WordPress sites often end up on the heavier side.
We'll skip over the various other requirements and processes that we went through when re-developing the site and concentrate on just page weight and performance.
The existing Fishpie site was far from the worst offender for bloated pages, but there was still some room for improvement. Taking note of the weights of a selection of pages, the rough average was 3MB: slightly above the web's apparent average, though I suspect probably well below the current median, and in the grand scheme of things acceptable for a site of this nature.
But I thought there was room for improvement. So, with that 3MB figure in mind I decided my measurement for improvement would be to cut that in half and use that as my maximum, not just my average. I'd aim to build the new site with a maximum of 1.5MB per page. This is quite a small number for a webpage these days, with images and other assets to load.
Though the Fishpie site isn't high traffic, this reduced page weight would reduce loading times for everyone, not just those on slow connections. It would also reduce the load on the server, the actual effect being pretty negligible for a site of this size but it's still a positive.
How did I plan to cut page weight in half? Well, it helped that with this project I had the freedom to completely rebuild the site, using whichever packages, frameworks and platform I chose.
I chose to build the new site with Statamic, a relatively little-known CMS, and while this CMS has proved to be a great tool it's not alone in being able to produce lightweight sites.
The main reason I thought I could build pages below 1.5MB was that I knew I was starting with a 0MB page and would add only the things I needed, instead of starting with, say, a 2MB+ page from a WordPress template before any content had even been added. This meant each additional asset was carefully considered, rather than being included by default.
Another quick win when trying to reduce page weight is to resize and compress your images. Oversized images are normally the number one cause of massive pages.
Statamic provides the functionality to resize and compress images as they are served and then serve the cached version on subsequent requests. It's not the only platform to offer this, but it's certainly the slickest I've worked with. This means you or your users can upload whatever size images they like (within the limits set by your server) and it will resize and compress them as needed.
If you're working with an existing site that doesn't provide this functionality, or you're building a static site, then the best way to achieve the same thing is to resize and compress your images manually. Image editing software like Photoshop and its alternatives is great for resizing images and will go a long way towards reducing file size. You can go even further with a tool like ImageOptim (Mac only, and my preferred tool), but just search for image optimisation and you'll find something that works for you (Squoosh seems pretty good).
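If you'd rather script the resize-and-compress step, here's a minimal sketch using the popular (third-party) Pillow library. The filenames, the 1600px cap and the quality setting of 75 are just illustrative assumptions, not hard rules:

```python
from PIL import Image  # third-party: pip install Pillow


def optimise(src, dest, max_width=1600, quality=75):
    """Scale an image down to max_width (keeping its aspect ratio)
    and re-save it as a compressed JPEG."""
    img = Image.open(src)
    if img.width > max_width:
        ratio = max_width / img.width
        img = img.resize((max_width, round(img.height * ratio)))
    # convert("RGB") ensures PNGs with alpha can still be saved as JPEG
    img.convert("RGB").save(dest, "JPEG", quality=quality, optimize=True)


# Hypothetical usage, e.g. against a straight-out-of-camera photo:
# optimise("hero-original.jpg", "hero-optimised.jpg")
```

Run over an assets folder before upload, something like this can turn a multi-megabyte camera export into a few hundred kilobytes with no visible difference at normal page widths.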
Another simple way of cutting down on page weight is to simply request fewer assets from the server. Open up the network tab of your favourite browser and take a look at what your site is requesting.
Go through each asset and ask whether it's needed. Get a development site running (if you haven't already, you really should), remove the files in groups or individually as needed, and test the site after each change.
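To make that audit concrete, here's a small stdlib-only sketch that lists the assets an HTML page will request. The page markup and filenames below are invented for illustration:

```python
from html.parser import HTMLParser


class AssetLister(HTMLParser):
    """Collect the URLs a page requests: scripts, stylesheets and images."""

    def __init__(self):
        super().__init__()
        self.assets = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "script" and "src" in attrs:
            self.assets.append(attrs["src"])
        elif tag == "link" and attrs.get("rel") == "stylesheet":
            self.assets.append(attrs.get("href"))
        elif tag == "img" and "src" in attrs:
            self.assets.append(attrs["src"])


# A made-up page with a couple of suspicious leftovers:
page = """
<html><head>
  <link rel="stylesheet" href="/css/site.css">
  <link rel="stylesheet" href="/css/old-carousel.css">
  <script src="/js/jquery.js"></script>
  <script src="/js/analytics.js"></script>
</head><body>
  <img src="/img/hero.jpg">
</body></html>
"""

parser = AssetLister()
parser.feed(page)
for url in parser.assets:
    print(url)
```

From the printed list you can then ask whether each request earns its keep: does anything still use `old-carousel.css`, and is jQuery actually needed?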
Arguably the negative impact of loading many files per page has decreased over time, now that HTTP/2 can multiplex requests over a single connection, but cutting unnecessary requests is still a win.
GZIP. It's like .zip files for the web. It's a feature that your hosting provider (or your VPS, if you manage one) should offer by default or as an option, and you should definitely enable it. Beyond flipping it on, it's a zero-effort way to instantly save some precious kilobytes. You'll only see the benefit on text files (CSS, JS and the HTML itself) but it's still well worth doing.
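To see why gzip pays off on text assets, here's a quick stdlib-only sketch (the CSS string is made up) showing how well repetitive markup compresses:

```python
import gzip

# Text-like content (CSS/JS/HTML) compresses extremely well because it's
# full of repeated keywords, selectors and whitespace.
css = (".header { margin: 0; padding: 0; color: #333; }\n" * 200).encode("utf-8")

compressed = gzip.compress(css)

print(f"original: {len(css)} bytes")
print(f"gzipped:  {len(compressed)} bytes")
print(f"saving:   {100 * (1 - len(compressed) / len(css)):.1f}%")
```

Real stylesheets are less repetitive than this toy example, but big savings on text are typical, while already-compressed assets like JPEGs gain almost nothing. That's why the benefit is limited to text files.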
So, to recap:

- Compress your images. This is generally the biggest, easiest saving.
- Cut down on unnecessary requests for things like old JS and CSS.
- Enable GZIP.
There are plenty of other things you can do, but these should get you pretty far.
Bonus tip: if you serve users almost exclusively in one location, choosing a server based in or near that location will help your site feel a little quicker. Or go a step further: set up a CDN, reducing the load on your server and serving everyone as if you were local to them.