Note: This article was originally written for my personal blog. I am republishing it here for the DEV community.
Server side rendering, abbreviated to SSR, is a commonly misunderstood concept. What is it, and why should you bother learning it? I hope to answer those questions with a story on the origins of server side rendering.
The benefits
Before I get into the story and explanation of SSR, it's worth understanding the benefits first. Why should you care?
🔍 Search Engine Optimization: Content is crawlable by search engines, so your site and pages will appear in Google search results.
📣 Social Media Optimization: When people post your links on Twitter, Facebook, etc., a nice preview will show up with the page title, description, and image.
⚡ Performance: Server side rendered pages load faster because the content is available to the browser sooner.
😊 User Experience: Similar to performance, content is available sooner, so the user is not waiting around looking at blank pages or loading spinners.
I also recorded a podcast episode explaining these benefits of SSR.
The origin story
Server side rendering has actually been around for as long as server programming languages such as Java, PHP, Python, and Ruby have existed. If you've ever written dynamic code in an index.php file or built an entire Ruby on Rails app, then you've already done server side rendering.
That makes understanding this concept a lot simpler. Let's say I have a PHP website where I am retrieving a list of games from a database. It might look like this:
<?php
// Fetch two games from the database.
// These functions are placeholders for this example.
$game1 = getFirstGameFromDatabase();
$game2 = getSecondGameFromDatabase();

// Build the HTML list on the server, before the response is sent.
echo "<ul><li>$game1</li><li>$game2</li></ul>";
Data is being retrieved and formatted into an HTML list entirely on the server. When you view this page in a browser, you don't have to wait for any JavaScript to run. The data is already available, and you'll see the list of games right away.
This is great for everyone, including search engines and social media. The data is already available in the source of the page, so web crawlers such as Google or even Facebook can parse this content and display search results or link previews.
Websites were built this way for years, but what we didn't see coming was the revolution of websites being written entirely on the client side using JavaScript.
The JavaScript revolution
Browsers are constantly becoming more powerful, meaning that you can do a lot more with JavaScript now than you could 10 years ago. So what did developers start doing? Writing their entire websites and web apps with client side JavaScript.
Yes, I am mainly referring to the usage of single page application (SPA) frameworks. While many came about, Angular is the one that primarily popularized this approach. Imagine being able to fetch some data via Ajax, add some special attributes to your markup, and voilà: you have a dynamic website without having to mess around with PHP and servers.
One big problem, though: your initial HTML no longer contains all that data the server was so nicely fetching and embedding in the page for us.
Now all you have is this:
<!-- 😭 My beautiful content is gone! -->
<div id="app"></div>
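To make that div useful, the browser first has to download, parse, and run JavaScript that fetches the data and builds the markup itself. A minimal sketch of that client side approach, assuming a hypothetical /api/games endpoint, might look like this:

```js
// Runs in the browser only after the page and this script have loaded.
// Until then, the user is staring at an empty <div id="app">.
fetch('/api/games') // hypothetical endpoint returning ["mario", "pacman"]
  .then((response) => response.json())
  .then((games) => {
    const list = document.createElement('ul');
    games.forEach((game) => {
      const item = document.createElement('li');
      item.textContent = game;
      list.appendChild(item);
    });
    document.getElementById('app').appendChild(list);
  });
```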
I'm sure that Google isn't very happy about that, and neither are users. On a slow connection, it may take a while for the JavaScript to make this page useful.
Note: Before you say, "but Google can crawl JavaScript now!", keep in mind that there are limitations and not all web crawlers do the same.
It'd be a pretty sad end if I said that we should stop building apps this way, especially when it's so efficient for developers. Can we have our cake and eat it too?
Universal JavaScript
Here's where it all comes together now. What if I said that we could take the traditional approach of server side rendering and combine it with our JavaScript?
Yes, it's possible! It's all thanks to Node.js which allows for what is known as Universal JavaScript: code that can be run in both a browser and server.
Let's say that we have a simple React component like this:
function GamesList({ game1, game2 }) {
  return <ul><li>{game1}</li><li>{game2}</li></ul>;
}
With the component being rendered to the page like so:
const games = <GamesList game1="mario" game2="pacman" />;
ReactDOM.render(games, document.getElementById('app'));
This is all being done on the client side. How can we do the same on the server side? Actually, React provides a method for that:
return ReactDOMServer.renderToString(games);
Now, instead of sending back an empty div, we can have a Node.js server return the full HTML of our React component, much like the PHP code we had earlier!
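As a rough sketch of how this could be wired up with an Express server (the file layout and component import are illustrative, and a real setup would also serve the client side bundle):

```js
const express = require('express');
const React = require('react');
const ReactDOMServer = require('react-dom/server');
const { GamesList } = require('./GamesList'); // the component from above

const app = express();

app.get('/', (req, res) => {
  // Render the component to an HTML string on the server.
  const html = ReactDOMServer.renderToString(
    React.createElement(GamesList, { game1: 'mario', game2: 'pacman' })
  );

  // The content is already in the page when the browser receives it.
  res.send(`<!doctype html><html><body><div id="app">${html}</div></body></html>`);
});

app.listen(3000);
```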
I did leave data fetching out of this example, but note that it is definitely possible to fetch data in our components on the server side.
Note: You're not losing out on the benefits of an SPA
A single page application (SPA) is popular not only for providing quick development time, but also for its client side routing. This provides a quick navigation experience for the end user, and is definitely something we do not want to lose when we begin server side rendering. Thankfully, you can still choose to use these frameworks on the client side to provide that experience. This means that the initial render uses SSR, but then subsequent navigations are like an SPA.
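In React terms, the client side entry point hydrates the markup the server already rendered instead of rebuilding it from scratch. A minimal sketch, reusing the hypothetical GamesList component from earlier (ReactDOM.hydrate is the API in the React versions current at the time of writing):

```js
import React from 'react';
import ReactDOM from 'react-dom';
import { GamesList } from './GamesList';

// Attach event listeners to the existing server rendered HTML
// rather than throwing it away and re-rendering it.
ReactDOM.hydrate(
  <GamesList game1="mario" game2="pacman" />,
  document.getElementById('app')
);
```

From this point on, client side routing takes over and navigations behave like an SPA.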
Using it in the real world
I hope this story helps explain what server side rendering is and why you would want to use it. You're probably wondering how to actually use it though.
While you can start from scratch and try to make your apps run on Node.js, it is a lot of work. You have to figure out how to properly implement data fetching, state hydration, CSS extraction, and many other things.
Thankfully, there are frameworks for this:
Next.js for server side rendered React apps
Nuxt.js for server side rendered Vue apps
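For example, a Next.js page with server side data fetching can be as small as this (getInitialProps was Next.js's data fetching hook at the time of writing; the fetchGames function is a placeholder):

```js
// pages/games.js — a hypothetical Next.js page

// Placeholder data source; a real app would query a database or API.
async function fetchGames() {
  return ['mario', 'pacman'];
}

function GamesPage({ games }) {
  return (
    <ul>
      {games.map((game) => (
        <li key={game}>{game}</li>
      ))}
    </ul>
  );
}

// Runs on the server for the initial request, and on the client
// for subsequent client side navigations.
GamesPage.getInitialProps = async () => ({ games: await fetchGames() });

export default GamesPage;
```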
Another option for achieving the benefits of server side rendering without the hassle of a Node.js server is using a static site generator. There are of course limitations, such as not being able to have dynamic on-demand routes (e.g. user profiles), but otherwise I definitely recommend taking a look at GatsbyJS. My personal site is powered by Gatsby, which I also wrote about.
I should also mention prerendering, which is basically having your own web crawler that can parse JavaScript. The resulting markup is then served to appropriate user agents such as search engines, and the benefit here is that you don't have to change the way your app is written. Prerender.io is a popular service that provides this functionality. Keep in mind, though, that you're still either maintaining a server or paying for a service, and you don't receive any performance benefits out of it.
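The core idea can be sketched as a small piece of middleware that routes known crawlers to a cached snapshot while regular users get the normal SPA. The user agent list and the getSnapshotFor helper below are purely illustrative; a service like Prerender.io handles all of this for you:

```js
const express = require('express');
const app = express();

// A deliberately incomplete list of crawler user agents.
const CRAWLERS = /googlebot|bingbot|twitterbot|facebookexternalhit/i;

// Hypothetical lookup of a previously rendered HTML snapshot,
// e.g. from a cache or a prerender service.
function getSnapshotFor(url) {
  return `<!doctype html><html><body>Snapshot of ${url}</body></html>`;
}

app.use((req, res, next) => {
  const userAgent = req.headers['user-agent'] || '';
  if (CRAWLERS.test(userAgent)) {
    // Crawlers get fully rendered HTML.
    return res.send(getSnapshotFor(req.originalUrl));
  }
  next();
});

// Everyone else gets the regular SPA shell.
app.use(express.static('dist'));

app.listen(3000);
```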
There is no definitive answer as to which option is best; you just have to weigh the benefits and downsides of each one and decide which is worth it for you.
Top comments (26)
User experience, well, specifically interactivity. The user can keep interacting with the application with SPAs. Imagine if a service like Trello was server rendered. The entire page would have to refresh every time a user moved a card, edited a card, etc.
Not to mention the decoupling of frontend and backend technology stacks.
Of course, if it's a page that doesn't require much interactivity and needs extremely fast load times (a blog, for example), then SSR trumps an SPA without a doubt.
I think there's some confusion here. I am not advocating only using SSR like a traditional monolithic app.
I am, however, advocating having a SPA experience that is also utilizing SSR. The initial render is from the server, and then the client side JavaScript takes over to provide a SPA experience. This means that web crawlers are able to view content on that initial render of any page.
This is probably my fault in not making that clear in the article, so I will probably add that to the "Universal JavaScript" section.
Update: I added a new section to the article to hopefully clear up this confusion.
But you can avoid that using AJAX. Are there any drawbacks?
But using AJAX would mean that it isn't SSR, wouldn't it?
It depends. I meant handling the initial render on the server, then handling the interactivity in the browser the way it was done for years. By SPA, I assume that the server will return a page with empty content as you mentioned and the JavaScript will handle the rendering, which is not what I meant by "handling it with AJAX". Generally, using AJAX doesn't imply it's an SPA.
Oh right, sorry I didn't realise that's what you were referring to.
However, I've seen a few websites trying that approach and the code was a mess. Probably poor planning. Have you worked with that pattern? How was your experience with it?
Yeah, it can get messy very quickly, but using a solid architecture and a suitable design pattern will make a huge difference. For the example of Trello, I would create an independent API in the backend, and for the front end part I would use a pub-sub design pattern, as the application is heavily event driven. Hence, the code will become more readable and extensible.
I just want to give an alternate solution that solves almost all SPA issues, but with a lot less hassle: services (self hosted or third party) for caching, similar to Prerender.io.
There is no need to complicate your code or use a specific language, framework, or server like Node.js, and it is very loosely coupled.
It acts like a buffer between your server and the internet: when a bot comes, it will be served a cached static version of the page.
Thanks for sharing Adrian! I mentioned Prerender.io at the end of the article. It's definitely an option to consider.
Yes, but you trick the reader, because there is no performance gain or loss. If you invest a lot of resources in servers, maybe you can achieve a faster first page render, if all goes well, but if the user is browsing a second page, the SSR will be slower overall.
Also, you mentioned only the cons of prerendering. It is actually a better solution: less code to maintain, less server resource consumption than an SSR solution, and a faster response time because it is only a cache, so you will have more performance.
Hopefully this clears things up:
I think there's some confusion here. I am not advocating only using SSR like a traditional monolithic app.
I am, however, advocating having a SPA experience that is also utilizing SSR. The initial render is from the server, and then the client side JavaScript takes over to provide a SPA experience. This means that web crawlers are able to view content on that initial render of any page.
This is probably my fault in not making that clear in the article, so I will probably add that to the "Universal JavaScript" section.
Rendora (github.com/rendora/rendora) does exactly what you need, and it's FOSS.
Thanks Sunny for providing these alternatives. Though, I'm still uncomfortable with the fact that I have to use a specific technology on the back end in order to solve the SPA issues, which in my opinion makes using traditional ways, with a solid and maintainable structure of course, more convenient.
Which means that Next.js and other solutions do these procedures; that is obviously extra work compared to the traditional SSR solutions. What are the drawbacks? Does it affect the server response time?
Hi Mazen. My point about using Next.js/Nuxt.js is that if you're already using modern SPA frameworks like React and Vue, then Next.js/Nuxt.js allow you to continue using those frameworks but with SSR features baked in. You have to use Node.js as a backend technology, but all it's doing is rendering your front end application.
Having said that, you really just have to evaluate your use cases. If you're comfortable using a traditional monolithic framework like Ruby on Rails, Laravel, or Django, by all means go ahead. If you're also just building apps behind a login screen, like a mail client, then you'd be better off building a SPA without any SSR.
As far as drawbacks go, I would say SSR does introduce complexity such as having to run a Node.js server. Performance is pretty good when using a framework like Next.js, but that doesn't mean that you can't slow things down by fetching too much data server side or just having a slow server.
MeteorJS: out of the box SSR, GraphQL (Apollo), support for Prerender.io, and monitoring of all the important things on the server (Node and code error reporting, time taken to run methods and publications, CPU % and RAM %, etc.). With Meteor you can do SSR and prerendering at the same time if you really need it and it makes you happy. For instance, send Facebook to SSR and Google to Prerender, or push to Prerender if you do SSR by default and your server hits 80% CPU.
This is from my Prerender log today: 2019-08-07T16:40:50.164Z got 200 in 8ms for xxxxxxxxx.com/. So if 8 ms is not fast enough for you, sure, try SSR :).
Now with Meteor you can deploy pretty much anywhere, including the Meteor hosting, Galaxy. It has a pretty large community and adoption, and if you need more... Meteor is reactive by default, includes Cordova builds for both Android and iOS, code splitting, supports Blaze, React, Vue, Angular, and possibly others, and has all the necessary pieces to use it as a backend for React Native.
I am not paid by Meteor. I just use it, love it and wanted to share my enthusiasm with you.
Some lessons learned while working on server side rendered apps at scale:
It used to be that a lot of companies showed two different versions of a webpage: one for bots (you find this out from the user agent of the request) and one for regular users.
Maintaining two different versions of a page became a maintenance nightmare for my team and me on one codebase we worked on.
We now just have one version of a page.
If you're doing your job right and creating valuable content for users, using the appropriate tags on your pages and Schema.org values, and using Google Search Console, you shouldn't need two versions of a page.
I really need to write an in-depth article about this.
I'm going to look up Nuxt! Thanks for holding us accountable! One of the things I did in my plain Node, Express, and Mongoose project was have a router dedicated to rendering the EJS templates. I would access the router with AJAX to have Node render and return the EJS templates to the frontend.
Happy to hear that Michael. I used to do something similar as well haha.
I am confused about your social media optimization point. If Facebook uses SSR (I read that somewhere), then why is a preview of Facebook posts not added when we post their links on another site, e.g. Twitter? You'd have obviously noticed that.
The "social media optimization" meant that the social medias will fetch a preview of your website. But this won't happen out of sudden, you have to define some specific meta tags that tell the social media's crawler what to fetch. If you check the source code of a Facebook's post you will notice that there's no meta tags ( twitter cards ) that tell the Twitter's crawler what to fetch, so no wonder why twitter is blind when it comes to Facebook's posts.
Yes, exactly. Even in an SPA, you can provide a title and meta tags. However, they can only be updated via client side JavaScript, which Facebook is unable to parse. If I share a page of your SPA on Facebook, it will still show the title, description, and image for your homepage.
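For reference, these are the kinds of tags crawlers look for in the initial HTML; the values here are purely illustrative:

```html
<head>
  <title>My Games Site</title>
  <!-- Open Graph tags, read by Facebook and many other sites -->
  <meta property="og:title" content="My Games Site" />
  <meta property="og:description" content="A list of classic games." />
  <meta property="og:image" content="https://example.com/preview.png" />
  <!-- Twitter Card tag -->
  <meta name="twitter:card" content="summary_large_image" />
</head>
```

If an SPA only sets these via client side JavaScript, most crawlers will never see the updated values.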
If you would like more control over your SSR routing, consider using Razzle.
Actually, prerendering is still the best, as it is only for crawlers. The user experience with an SPA is many times faster. I use a Redis cache and only prerender a page once a week.
Why not just server side render the first page and lazily load your JavaScript?
Food for thought: dev.to/cliffordfajardo/comment/8984
I may have stirred up some confusion here, as I think you can have both an SPA and SSR experience in one:
I think there's some confusion here. I am not advocating only using SSR like a traditional monolithic app.
I am, however, advocating having a SPA experience that is also utilizing SSR. The initial render is from the server, and then the client side JavaScript takes over to provide a SPA experience. This means that web crawlers are able to view content on that initial render of any page.
This is probably my fault in not making that clear in the article, so I will probably add that to the "Universal JavaScript" section.
SPAs are more performant than SSR; the only difference is the landing page, and sometimes not even that.
The performance a user gains by using an SPA grows linearly with the number of pages visited.