Federico Navarrete

Originally published at supernovaic.blogspot.com

Can you make your website green 🌳♻🌳?

Did you know that any website (including yours) produces CO₂ ⛽🤔?

A couple of years ago, we interviewed Xavier Vásquez: The Future of Research

Xavier shared with us that he was analyzing the electricity consumed ⚡ by every app we run on our devices 💻📱.

Indeed, every swipe on Tinder or Bumble 💘 affects the environment 🌍, so be careful how often you swipe!

Coming back to the serious part: last year, before COP 28 ⛽, I saw a considerable trend of companies and startups offering new solutions that could "help" you become more sustainable ♻.

They felt like buzzwords to me, since I even saw politicians leaving their previous roles to become "ecopreneurs" just to have a spot at COP 28. No hard feelings, but I felt most of these initiatives were trying to profit from the UN event, not trying to save our planet.

That's why I decided to wait until the hype was over to analyze some solutions. I was particularly curious about the ones aimed at websites' eco-transformations. Before 2017, I was mainly a backend developer, but afterward I got heavily into web development. That's why I wanted to test these website carbon footprint calculators:

Most solutions were not exactly very helpful since their suggestions were vague:

  1. Your website produces 88% more CO₂ than the others we have analyzed!
  2. Your website produces 2.88 g per visit! That adds up over a year.
  3. Oops, you're not running on green hosting!
  4. Your images or code are not optimized!

That pretty much summarizes their results (if you didn't pay for an audit). I even requested an audit from another provider, and the results were no better.

Ecograder, on the other hand, gave me extremely valuable insights:

Ecograder report

Potential improvements

Thanks to Ecograder, I learned the following 14 lessons:

  1. To make my website efficient and well crafted on every device (watch, mobile, tablet, desktop, etc.) with minimal impact on performance.
  2. To create custom icon fonts. This reduced the complexity of my website and improved its accessibility.
  3. To identify where I had technical debt (unused code).
  4. To use modern code. I used too much legacy code that was convoluted and complex. In short: heavy.
  5. To prioritize which images I can compress and resize and which I cannot.
  6. To remove plugins that never worked.
  7. To use modern image formats like WebP instead of only JPG or PNG.
  8. To resize images to their appropriate dimensions.
  9. To lazy load images that are not initially visible (you have to scroll to see them).
  10. To reduce the number of calls to the hosting. I aimed for a maximum of ten by combining some files into one.
  11. To use dark themes. I already used them since I like them more, but it was good to learn.
  12. To load non-critical scripts dynamically after the website has fully loaded.
  13. To lazy load images in internal, initially invisible iframes.
  14. To use CDNs (Content Delivery Networks) whenever you must use full third-party libraries like Bootstrap or jQuery.
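Lesson 12 can be sketched in a few lines of plain JavaScript. This is only a minimal sketch of the general technique, not my site's actual implementation; the `deferScripts` helper name and the injectable `doc`/`win` parameters are my own choices (the injection also makes it easy to test):

```javascript
// Minimal sketch: inject non-critical third-party scripts only after the
// page's load event, so they don't compete with critical resources.
// `doc` and `win` default to the browser globals but can be injected.
function deferScripts(scriptUrls, doc = document, win = window) {
  const inject = () => {
    for (const src of scriptUrls) {
      const s = doc.createElement('script');
      s.src = src;
      s.async = true; // don't block parsing when it finally loads
      doc.body.appendChild(s);
    }
  };
  // If the page already finished loading, inject immediately;
  // otherwise wait for the load event.
  if (doc.readyState === 'complete') inject();
  else win.addEventListener('load', inject);
}
```

You would call it once, e.g. `deferScripts(['https://example.com/analytics.js'])` (a placeholder URL), instead of hardcoding those script tags in the HTML.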

Can this be automated? It seems like a lot of work ...

Indeed, it's tons of work. Some parts can be automated, and some cannot be fully automated. There are also several considerations if you want to go as green as possible. Let's analyze some of them:

  • Ecograder can help you identify which JS or CSS files contain unused code. CSS can be "cleaned" with PurgeCSS, but JS is more complex and might involve a lot of re-engineering. Also, most solutions will suggest you remove Google Analytics (which I highly doubt you want to do).

Google Analytics

  • Generative AI solutions like GitHub Copilot or ChatGPT might help you find code improvement points, but they are not bulletproof, and a small change can break too many things. I used them to refactor some sections of my code, and it was a mess.
  • Modern solutions like SPAs (Single Page Applications) will be more efficient in some cases (mainly for complex websites), since you make fewer calls to the server and the code is cleaner and nicer. However, if you didn't build your website with these solutions and your budget is tight, you must consider a different process. Additionally, SPAs are not always great at handling unused third-party CSS files, since you would need to do some magic with PurgeCSS.
  • PNG/JPG images can be auto-converted into WebP via APIs. Some APIs can also resize and compress them automatically, so you have the full package. However, you must be very careful with your brand images: any image at the top of the page should never look pixelated. Heavily automated optimizations might damage them if you don't exclude them.

Unoptimized brand image

  • Changes to your website can hurt your score. No website is static: your brand evolves, you change over the years, etc. If a new image or text block is large or heavy, it might have a bigger impact on your score.
  • Moving to green hosting might have considerable consequences if you don't know its architecture. As a hosting user, you often have monthly or yearly contracts, so you cannot easily change overnight. Additionally, what if you move to a service that is poorly accessible from distant locations? Say you have new clients in India, you are located in Sweden, and your new green hosting is in Ireland. Your website might become slow for them since it's far from India. This will be hard to optimize if you don't know about software architecture and your new provider has no edge locations near India.
  • The calculators don't analyze each scenario. My website has four different user experiences: desktop, mobile, tablet, and watch. Each experience has specific UIs and optimized images. Therefore, I don't think my website produces the same amount of CO₂ when a user accesses it from their mobile as from their desktop.
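As a reference for the PurgeCSS step mentioned above: the cleanup is driven by a small config file. Here is a minimal sketch, where every path and safelisted class is a hypothetical placeholder, not taken from my site:

```javascript
// purgecss.config.js — minimal sketch; all paths here are hypothetical.
module.exports = {
  // Files PurgeCSS scans to find which class names are actually used.
  content: ['./**/*.html', './js/**/*.js'],
  // Stylesheets to strip down.
  css: ['./css/bootstrap.css', './css/site.css'],
  // Classes added at runtime (e.g. by Bootstrap's JS) must be safelisted,
  // or they get purged and the page visually breaks.
  safelist: ['show', 'collapsing', /^modal/],
  // Where the cleaned CSS is written.
  output: './dist/css/',
};
```

You would then run it with something like `npx purgecss --config ./purgecss.config.js` and compare the before/after file sizes.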

Is there anything else I did and didn't mention? Indeed, I "lazy" loaded sections based on the user's scroll position. However, this might or might not reduce your impact, since the calculators analyze the full page load, not just the first impression. I see it as nice to have, but it might not be as eco-friendly as we think. I might need to do longer research to analyze its results.
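For the curious, this kind of scroll-based lazy loading is usually built on `IntersectionObserver`. Below is a minimal sketch of the technique, not my exact code: the `.lazy-section` class and the `data-src` convention are assumptions, and the observer class is injectable only so the sketch can be tested outside a browser:

```javascript
// Minimal sketch: swap in real image sources only when a section
// approaches the viewport. In the browser, ObserverClass defaults to
// the native IntersectionObserver.
function lazySections(root, ObserverClass = IntersectionObserver) {
  const observer = new ObserverClass((entries, obs) => {
    for (const entry of entries) {
      if (!entry.isIntersecting) continue;
      // Upgrade placeholder images inside the section that just appeared.
      for (const img of entry.target.querySelectorAll('img[data-src]')) {
        img.src = img.getAttribute('data-src');
        img.removeAttribute('data-src');
      }
      obs.unobserve(entry.target); // load each section only once
    }
  }, { rootMargin: '200px' }); // start ~200px before it scrolls into view
  for (const section of root.querySelectorAll('.lazy-section')) {
    observer.observe(section);
  }
  return observer;
}
```

In a page you would call `lazySections(document)` once the DOM is ready.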

That pretty much summarizes my experiment to transform my website into a green one ♻. The best result I got yesterday (Feb 15th, 2024) with the first solution:

federiconavarrete.com

I was able to move from D to B. The only bigger improvement I could make would be to switch hosting providers. However, that's something I wouldn't do. GitHub is transitioning to being a green company by 2025 and carbon-negative by 2030, so they are already on the way. I can wait a little longer.

Hopefully, this article can help you with your digital eco-transformations and bring the best of them to your websites.

Update 2024-02-28:

I reached an A in the Website Carbon calculator after some adjustments, without green hosting:

Website Carbon results

Update 2024-04-11:

I reached an A in Ecograder after several adjustments, without green hosting:

Ecograder results

Follow me on:

Personal LinkedIn YouTube Instagram Cyber Prophets Sharing Your Stories

Top comments (28)

BezPowell

Modern solutions like SPA (Single Page Applications) will be more efficient in new cases since you will make fewer calls to the server and the code is cleaner and nicer.

My personal experience has generally been that SPAs are almost always less efficient. A traditional website renders content on the server side and sends that to the client as HTML, a language it can parse and render natively and has had decades' worth of performance optimisations to process.

SPAs, on the other hand, first have to send a (typically large) JavaScript file to the client that must be parsed and executed, just to manage things like hydrating content and page navigation that the browser does natively. Any content is then sent as JSON, a language that has to be offloaded to JS to be parsed, before the page is built by the client.

A good rule of thumb for performance (and carbon emissions, as the two often go hand in hand) is to use browser native functionality wherever possible. Endlessly reinventing the wheel for the sake of perceived developer convenience is rarely better for performance. There are, of course, web applications that do work better as an SPA, but it's a mistake to try and build traditional websites this way in the belief it will improve their performance.

There's a guideline for this in the currently draft W3C Web Sustainability Guidelines: 3.23 Take Advantage of Native Features

JoelBonetR 🥇

TL;DR (lol), but anything that's client code has two big factors:

  • The size of the build
  • The client's device

To deal with the former, run your code through a bundler adequately configured to minify it, make sure the server has compression enabled, and so on.

The latter depends on how efficient the device running the site is. E.g., a MacBook M2 will use far less energy than a MacBook i9 to load the same website.

Of course, if you go the SSG way you'll find benefits in both 😁

Federico Navarrete

Those are good points, but they raise a couple of questions: can you modify your server? What if you cannot? I see it from the end users' perspective: they might have hired hosting for X period of time, and they can only upload their files. They have no control over the infrastructure. In that case, you cannot set up better caching, since you depend on your provider. Sure, you can always change providers, but that means reconfiguring everything. I don't think that's the right way to go.

Where do people access your website? I'd say most of them do it from their phones. So it's a similar case to the MacBook M2; but likewise, you can upload huge images, and the result is a heavy, slow website with a bigger carbon footprint.

JoelBonetR 🥇

I mean that there's a correlation here: if everybody uses more efficient devices and tries to reduce the size and computational complexity of their software as much as possible, we'll reach the peak in that subject.

PS: even if you cannot modify your server because you're on shared hosting (which accounts for 37.64% of the hosting market share as per DemandSage statistics), you can use an external service to cache your static assets (scripts, images, videos, HTML files...), e.g. Cloudflare.

Federico Navarrete

Do you have any tutorial on how to upload only some files to Cloudflare? I was able to create a setup, but I could only move my entire website, not just the scripts or images.

JoelBonetR 🥇

You don't need to "move" your files into it.
You can keep the website somewhere else and configure your DNS so that Cloudflare is the one answering requests to your website. Cloudflare will then answer each request with the most recent cached version. It handles the cache automatically for the most part.

Learn more here 😁

Federico Navarrete • Edited

It's an interesting thought, but what if your client only has an S3 bucket? I have a friend who faced that limitation. How do you deal with it? I'd like to know your thoughts.

SSR is an amazing solution when you have your own hosting, server, cluster, etc., which involves costs that not everyone wants to pay if they weren't budgeted from the beginning. That's why I focused on what I could fix without moving to other providers or solutions.

Ahmad Adibzad

Great post! Lazy loading pagination is another useful technique to reduce requests to the server. When the user wants to go back to previous pages, the past results are already there, so the server doesn't have to send them again.

Federico Navarrete

Indeed, I did something "similar": I loaded sections on demand based on the scroll position. However, that's not very "visible" to the calculators; from what I can see, they scroll through your entire website to analyze the full impact. Or maybe you can share your technique with us. It would be worth knowing, Ahmad. Thanks!

Abhishek Mishra

uhhh, cache-control headers?

Federico Navarrete

If I could configure GitHub Pages, it would be a great solution, but I cannot.

Steve Lebleu

Thanks for sharing. It's one of the big subjects on the table in 2024. I did some tests with each tool, and here is my summary:

  • Each tool gives a different result from the others. That's quite normal. Websiteemissions is quite different from the others.
  • Only the last two links give technical information about the result and how to improve it. Again, Websiteemissions is the worst, since it just says you're a bad guy 😎. Ecograder is indeed the most relevant, because its explanations of the result are complete.
  • If I'm right, unfortunately, none of them explains clearly and in detail how the result is calculated. We know it's mostly related to the number of requests and the weight of the resources, but the other criteria are not clear. Same for the weighting.
  • Detail: some tools flag my hosting as green, some others don't.

A good similar tool in French: ecoindex.fr/

Federico Navarrete

The French one seems quite odd, or even "fake" or lower quality. I have tested many solutions, and its results are crazy:

Page rank: 168240 / 250135
Too heavy: 1,432 MB
Too complex: 778 items
Too many requests: 131 requests

ecoindex.fr/resultat/?id=792f0978-...

I don't think the results are accurate, since it took almost no time to analyze the site, and the results are drastically different from any other calculator.

Steve Lebleu

Hmmm, strange. For the test I did, the result was quite similar to the other tools (except Websiteemissions), and quite logical as well.

Balancing the results of different tools is probably the best way to deal with it.

Federico Navarrete • Edited

I tried 3 websites that I know: two hosted on green hosting, plus mine (normal hosting, a.k.a. GitHub Pages). All the other calculators got similar results, but Ecoindex.fr was extremely different. Furthermore, it didn't make anything clearer to me, and its suggestions were not very useful.

Ecoindex.fr Suggestions

Ecograder, on the other hand, suggested very specific actions to take, for example that a certain CSS file (Bootstrap, say) was not optimized, or that a specific JS file was not minified.

Ecograder preview

I would say that between the common calculators and Ecoindex.fr there are widely unrelated standards, since their conclusions are considerably different. For example, Ecoindex.fr highlights the number of elements a website has.

Number of elements

This is something none of the other calculators mentioned, and it is difficult to fix in most cases. You can run Lighthouse to analyze your website's performance, and I've never had a recommendation that told me, "Too many divs! Remove some of them!" It's a very strange standard to consider. Also, it was the only calculator that did not mention the lack of green hosting on my website, which was very odd. I personally cannot recommend Ecoindex.fr for now.

My conclusion, for now, would be one of these:

  1. All the other calculators have considerable flaws and Ecoindex.fr is amazing.
  2. All the other calculators are on the right track and Ecoindex.fr must check its standards, since something seems fishy.

Here are my results using most calculators:
Previous results


Steve Lebleu

Yes, as I said, it's strange. I did the same test with my own website, and the results were different, but not to this extent. And the worst there was websiteestimation. Anyway. Also, I'm not entirely with you on your remarks against Ecoindex about the complexity of the DOM, because it is an indicator. That said, I don't know exactly how it's used in the weighting, so...

Federico Navarrete

I even checked a heavier website that I know, and its results were as bad as mine. I think something is wrong: both of us ranked E, which made no sense. Also, I don't know why my website would be a megabyte and something. It's odd.

Mr. Linxed • Edited

Thanks for this article. I checked it out, and although I am not scoring badly, I can make some improvements to my website.

I'll pick this up sometime soon.

Federico Navarrete

Share your score! It's good for everyone!

Bala Madhusoodhanan

Love the content... I wrote a similar blog post.

Ingo Steinke

You might add greenwashing: the automated audit tools put a green check mark for supposedly green hosting on many websites that run on international clouds unlikely to avoid fossil fuels in every country. You might also add micro-optimization: audits like Lighthouse / PageSpeed Insights always have some nitpicking left, including dubious advice that might backfire, and so do some manual optimization checklists. And as others already said, "modern technology" like SPAs and AI can also rebound and waste more energy than it will ever save.

Federico Navarrete • Edited

Nothing is perfect, and there will always be biases; you might skip one or two things. The same happens with anything in our world. I remember when I was in charge of building the first recommender system for taxes at a company: in some countries, a bottle of soap had many taxable components, and in others fewer. Therefore, some countries were greener and others less clean. However, I won't jump to the conclusion that it's all greenwashing. If you can minify your code and make it faster, then it's saving resources; therefore, it's cleaner.

About SPAs and AI, you're mixing two topics into one. In my analysis, I considered SPAs more efficient mainly for complex apps. I didn't build my website with Angular or React, and I don't think it would be good practice, since it's not a complex app. It has some complexity, since I create many things dynamically, but I don't have multiple pages, log-ins, etc.

When I spoke about AI or LLMs, I focused on how GitHub Copilot or ChatGPT can help you identify code that can be optimized, faster. Visual Studio, for example, has a powerful feature called IntelliSense, introduced decades ago, that analyzes all your files and helps you remove redundant code while optimizing it. What's more, you don't need all the code in one file to analyze it (if everything is set up correctly). This is where I can see LLMs helping. Say you have a large website: how do you know where you can optimize something? At least I don't know everything about JS or CSS (especially the yearly changes); there might be something I forget, or something with a considerable impact that I didn't pay attention to. Not optimizing that code can have considerable consequences if your website has heavy traffic. The less technical debt we have, the more efficient the site will be.

Matt Ryan

Any service running in the AWS Canada (Central) Region will be green, as the data center, if I recall correctly, is located in Quebec.

94% of Quebec is powered by hydro-electricity.

Federico Navarrete

Yes, but I'd not focus on that point. Green hosting is not the solution to every problem. I have analyzed other websites that are green-hosted and still rank F. Why? Because they don't follow any standards. Also, moving to green hosting can have consequences if your client (or you) has a contract with a hosting provider; you cannot always cancel a contract without repercussions. In my case, I could move anywhere, but that's not true for everyone.

Matt Ryan

Good point. Thank you

Martin Falk Johansson

Making an SPA does not make a page green; it just causes bloat. Unless you're making an advanced app, the only thing an SPA does is waste resources. And with all due respect, how in the name of all things holy could making an SPA from what you already have be automated?!

Using AI to make a webpage more environmentally sound is a paradox in itself, since using LLMs eats a lot of processing power, i.e. wastes energy and burps out CO2.

You also seem to miss that the heaviest resource on the internet, and the one that has ballooned over the last decades, is JS; it's not images. SPAs are full of unnecessary JS that does things the browser already does.

Federico Navarrete • Edited

The first point is fair. It depends on what you need; I'm not saying that for a personal website you should use Angular or React. I built my own website with HTML, JS, and CSS only.

Second point: how do you know which sections of your own JS/CSS code are not optimal? How do you know how to optimize them? At least I don't know everything, and I can miss something that becomes a bigger piece of code, which GitHub Copilot or ChatGPT can catch faster than me.

Indeed, optimizing code using Copilot will consume more resources the first time, but if your website is visited a lot (thousands or millions of viewers per month), it will save more resources in the long term after the optimizations. Again, I don't know all the JS/CSS changes over the years and cannot know them all. This is where I see the benefit of using LLMs to improve my code's efficiency.

JoelBonetR 🥇

My good old portfolio has an A+ (0.05g/visit) 😂

I assume it's due to computational load; it's a static site with almost no JS in the dist (a CSS-only approach whenever possible).