React is a popular framework/library these days for web development. It has somehow amassed a larger following than its competitors Angular and Vue. On top of that there is Next.js, a framework built on top of React, which is getting popular due to its server-side rendering capability. But why is rendering becoming a design decision these days? Let's dive deeper.
We all know how the web works. When you visit google.com, you are basically asking Google's server to send you the index.html page for the google.com domain. Once the HTML reaches the browser, it builds a nice DOM tree and shows you the UI. Right? Well, in the early days it worked like this. But now there is another way of doing it. Enter client-side rendering. So what is it?
Put simply, there are two ways of rendering, or displaying, HTML. One, you do it on the server: you take the code, convert it to HTML, and send that as the response to the browser. And then there is client-side rendering (CSR), where the server sends the browser a JS bundle (a minified, mangled blob of JS code) which the browser executes to generate the HTML itself. Why do we have it?
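The contrast can be sketched in a few lines of plain JavaScript (the page content and function names here are invented for illustration, not taken from any real framework):

```javascript
// Server-side rendering: the server turns data into finished HTML
// before responding, so the browser receives the content directly.
function renderOnServer(data) {
  return `<html><body><h1>${data.title}</h1><p>${data.body}</p></body></html>`;
}

// Client-side rendering: the server only sends an empty shell plus a
// script tag; the browser must run bundle.js to produce the markup.
const csrShell =
  '<html><body><div id="root"></div><script src="/bundle.js"></script></body></html>';

const page = renderOnServer({ title: 'Hello', body: 'Rendered on the server' });

// The SSR response already contains the content...
console.log(page.includes('Rendered on the server')); // true
// ...while the CSR shell contains none of it until the JS runs.
console.log(csrShell.includes('Rendered on the server')); // false
```

Whoever fetches the CSR shell without executing JavaScript (a crawler, a preview bot) sees only the empty `div` — which is exactly where the problems below come from.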
Well, think of it like this. There are a million computers and one server. This server has to convert code to HTML so that those million computers can view the page. Instead, what if we sent out the bundled code and let those computers build the HTML themselves? We would free up the server and use client-side resources. In real life, though, it isn't the origin server doing all of this every time; there are usually CDNs that cache these sites to make them load faster.
So CSR is the best, right? Well, not exactly. With CSR it becomes difficult to generate meta tags, and harder for crawlers to get your site into Google's index. Why is this so? Let's understand.
Whenever you share a link on Facebook, WhatsApp, Twitter or any other social networking site, you get to see a nice little preview of the webpage: a title, a short description and a thumbnail image.
This preview is generated from the meta tags in the head section of your webpage. So what is the big deal? Don't CSR pages have meta tags? Yes, they do. But imagine a situation like this. Most websites today are dynamic: data is loaded into the page using AJAX calls, which means what goes into the tags is only known after the AJAX call finishes. Since React renders on the client, when you paste a link the dynamic preview tags don't exist yet; the JS has to be executed first and the HTML generated. It's not that this can't be done, but most social sites will only wait a fixed amount of time for the tags to appear. If the crawler doesn't find them within that time, it gives up and never shows a preview. So you end up with an ugly default preview.
With SSR this is not a problem, because the tags arrive in the very first response: they were rendered on the server.
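Here is a tiny sketch of the difference a preview bot sees (the tag values and helper names are made up for illustration; real bots parse HTML properly rather than with a regex):

```javascript
// CSR: the shell ships with a generic, hard-coded title; the real one
// would only appear after the browser runs JS and the AJAX call returns.
const csrResponse =
  '<head><meta property="og:title" content="My App"></head><div id="root"></div>';

// SSR: the server fetches the data first, then injects it into the
// markup, so the very first response carries the right preview tags.
function renderWithTags(article) {
  return `<head><meta property="og:title" content="${article.title}"></head><h1>${article.title}</h1>`;
}

// Naive helper mimicking a preview bot: pull og:title out of raw HTML.
function ogTitle(html) {
  const match = html.match(/property="og:title" content="([^"]*)"/);
  return match ? match[1] : null;
}

console.log(ogTitle(csrResponse)); // "My App" — the generic fallback
console.log(ogTitle(renderWithTags({ title: 'Ten Facts About Cats' }))); // the real title
```

The bot fetching the CSR response only ever sees the generic title, no matter how good the dynamic content would have been.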
But what about Google's indexing? If you are not familiar with how search works, that's okay; most of us aren't, since it's a proprietary algorithm that the entire industry is still guessing at. That is how the entire SEO industry was born.
To put it simply, Google has a lot of bots/crawlers/spiders, which are just programs that run at regular intervals and scan pages. A spider starts from a page and keeps visiting the links it finds there until the entire web is scanned. When the spider gets an SSR link, it can scan it quickly because it receives the HTML content directly. But when it gets a CSR link, it has to wait for the page to load and then scan its contents. Again, this depends on how long the spider will wait for the page to load. Imagine you have a really cool website about cats, and it takes so long that Google's bot skips it. Your entire effort goes to waste: that page will never get a rank on Google, so it won't be visible to anyone. Google is making changes to its algorithm, though, since so many websites are built with React nowadays.
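The crawl itself can be sketched as a simple breadth-first walk over pages. This is only a toy: the "web" below is an in-memory map of server-rendered HTML, and the link extraction is a deliberately naive regex, nothing like a real spider. A CSR page would map to an empty shell here, leaving the crawler no links or content to scan:

```javascript
// A toy "web": URL -> server-rendered HTML.
const web = {
  '/': '<a href="/cats">cats</a><a href="/dogs">dogs</a>',
  '/cats': '<a href="/">home</a><p>All about cats</p>',
  '/dogs': '<p>All about dogs</p>',
};

// Pull href values out of the markup (naive on purpose).
function extractLinks(html) {
  return [...html.matchAll(/href="([^"]*)"/g)].map((m) => m[1]);
}

// Breadth-first crawl: start somewhere, follow every link found,
// and never visit the same page twice.
function crawl(start) {
  const visited = new Set();
  const queue = [start];
  while (queue.length > 0) {
    const url = queue.shift();
    if (visited.has(url) || !(url in web)) continue;
    visited.add(url);
    queue.push(...extractLinks(web[url]));
  }
  return [...visited];
}

console.log(crawl('/')); // visits '/', '/cats', '/dogs'
```

Starting from the home page, the crawler reaches every page the HTML links to — which is exactly why content that only appears after JS runs is invisible to a spider that doesn't execute JS.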
If you are still set on React, there are libraries that take care of the tag-preview problem. React Helmet is a popular npm package that lets you change your meta tags dynamically. If you are using Netlify to host your website, you also have to enable an option they call pre-rendering in your site settings to make the change visible to crawlers.
There are also services like prerender.io that make this possible. But these are more like workarounds than actual solutions.
Think of it like this: they take your website, render it once, fix up your meta tags, and then serve that — basically, they save an HTML snapshot of your otherwise-JS website. Prerender.io can cache such pages to cut down the time spent when spiders crawl your website, but then you also have to clear that cache whenever you change your website.
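Conceptually, a pre-rendering service does something like the following. This is a rough sketch under big simplifications: real services run a headless browser to render the page, and the cache and tag rewriting here are stand-ins invented for illustration:

```javascript
// Cache of already-rendered snapshots, keyed by URL.
const cache = new Map();

// Replace the content of the og:title meta tag in rendered HTML.
function rewriteTitle(html, title) {
  return html.replace(/(property="og:title" content=")[^"]*(")/, `$1${title}$2`);
}

// Serve a bot: take the rendered page, fix up the tags, and cache the
// result so later crawls don't pay the rendering cost again.
function serveToBot(url, renderedHtml, title) {
  if (!cache.has(url)) {
    cache.set(url, rewriteTitle(renderedHtml, title));
  }
  return cache.get(url);
}

const html = '<head><meta property="og:title" content="My App"></head>';
console.log(serveToBot('/cats', html, 'All About Cats'));

// After a deploy, the stale snapshot has to be evicted by hand —
// the cache-clearing chore mentioned above.
cache.delete('/cats');
```

The cache is what makes crawls fast, and it is also exactly why you have to remember to clear it after every change.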
Now it's up to you. If you have a public-facing webpage that has to rank on Google and generate previews on social networking sites, go for SSR. If it's client-side speed you are after, go for CSR.
Two roads diverged in a wood, and I—
I took the one less traveled by,
And that has made all the difference.
Happy programming!