Note: This article was originally written for my personal blog. I am republishing it here for the DEV community.
User experience, or more specifically, interactivity. Users can keep interacting with the application in a SPA. Imagine if a service like Trello were server rendered: the entire page would have to refresh every time a user moved a card, edited a card, and so on.
Not to mention the decoupling of frontend and backend technology stacks.
Of course, if it's a page that doesn't require much interactivity and needs extremely fast load times (a blog, for example), then SSR trumps a SPA without a doubt.
I think there's some confusion here. I am not advocating only using SSR like a traditional monolithic app.
I am, however, advocating having a SPA experience that is also utilizing SSR. The initial render is from the server, and then the client side JavaScript takes over to provide a SPA experience. This means that web crawlers are able to view content on that initial render of any page.
This is probably my fault in not making that clear in the article, so I will probably add that to the "Universal JavaScript" section.
Update: I added a new section to the article to hopefully clear up this confusion.
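In code, that flow looks roughly like this. This is only a minimal sketch assuming Express and React; the file names, `App` component, and bundle paths are made up for illustration:

```javascript
// server.js -- the first render happens on the server (illustrative sketch)
const express = require('express');
const React = require('react');
const { renderToString } = require('react-dom/server');
const App = require('./App'); // hypothetical root component

const app = express();
app.use('/static', express.static('build')); // the compiled client bundle lives here

app.get('*', (req, res) => {
  const markup = renderToString(React.createElement(App));
  res.send(`<!DOCTYPE html>
    <html>
      <head><title>My app</title></head>
      <body>
        <div id="root">${markup}</div>
        <script src="/static/client.js"></script>
      </body>
    </html>`);
});

app.listen(3000);
```

```jsx
// client.js -- the same component hydrates in the browser and the SPA takes over
// (assumes a bundler such as webpack compiles the JSX)
import React from 'react';
import ReactDOM from 'react-dom';
import App from './App';

ReactDOM.hydrate(<App />, document.getElementById('root'));
```

Crawlers see the markup inside `#root` on that first response; all navigation after hydration stays client-side.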
But you can avoid that using AJAX. Are there any drawbacks?
But using AJAX would mean that it isn't SSR, wouldn't it?
Depends. I meant handling the initial render on the server and then handling the interactivity in the browser, the way it was done for years. By SPA, I assume the server will return a page with empty content, as you mentioned, and the JavaScript will handle the rendering, which is not what I meant by "handling it with AJAX". Generally, using AJAX doesn't imply it's a SPA.
Oh right, sorry I didn't realise that's what you were referring to.
However, I've seen a few websites trying that approach and the code was a mess. Probably poor planning. Have you worked with that pattern? How was your experience with it?
Yeah, it can get messy very quickly, but using a solid architecture and a suitable design pattern makes a huge difference. For the Trello example, I would create an independent API on the backend, and for the front end I would use a pub-sub design pattern, since the application is heavily event-driven. That way the code becomes more readable and extensible.
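For illustration, a bare-bones pub-sub event bus might look something like this; it isn't tied to any framework, and the `cards:moved` event just echoes the Trello example:

```javascript
// A minimal pub-sub event bus (illustrative sketch)
class EventBus {
  constructor() {
    this.handlers = {}; // event name -> array of callbacks
  }

  subscribe(event, handler) {
    (this.handlers[event] = this.handlers[event] || []).push(handler);
    // Return an unsubscribe function so components can clean up after themselves
    return () => {
      this.handlers[event] = this.handlers[event].filter(h => h !== handler);
    };
  }

  publish(event, payload) {
    (this.handlers[event] || []).forEach(handler => handler(payload));
  }
}

// Usage: the board view reacts to card moves without knowing who triggered them
const bus = new EventBus();
bus.subscribe('cards:moved', ({ cardId, toList }) => {
  console.log(`Re-render card ${cardId} inside list ${toList}`);
});

// Somewhere in the drag-and-drop handler, after the API call succeeds:
bus.publish('cards:moved', { cardId: 42, toList: 'Done' });
```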
I just want to offer an alternate solution that solves almost all SPA issues with a lot less hassle: caching services (self-hosted or third-party), similar to Prerender.io.
There is no need to complicate your code or use a specific language, framework, or server like Node.js, and it is very loosely coupled.
It acts like a buffer between your server and the internet: when a bot comes along, it is served a cached static version of the page.
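As a rough sketch of that buffer idea, here is a hand-rolled Express middleware. The bot list and `PRERENDER_URL` are assumptions for illustration; real services ship their own middleware:

```javascript
// Bots get cached static HTML from the prerender service, humans get the SPA shell
const express = require('express');
const fetch = require('node-fetch');

const PRERENDER_URL = 'http://localhost:3001/render?url='; // hypothetical cache/prerender service
const BOT_AGENTS = [/googlebot/i, /bingbot/i, /facebookexternalhit/i, /twitterbot/i];

const app = express();

app.use(async (req, res, next) => {
  const ua = req.headers['user-agent'] || '';
  if (!BOT_AGENTS.some(re => re.test(ua))) return next(); // regular users: fall through to the SPA

  try {
    const fullUrl = `${req.protocol}://${req.get('host')}${req.originalUrl}`;
    const cached = await fetch(PRERENDER_URL + encodeURIComponent(fullUrl));
    res.send(await cached.text()); // serve the cached static version to the crawler
  } catch (err) {
    next(); // if the prerender service is down, fall back to the normal response
  }
});

app.use(express.static('build')); // the SPA bundle for everyone else
app.listen(3000);
```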
Thanks for sharing Adrian! I mentioned Prerender.io at the end of the article. It's definitely an option to consider.
Yes, but you mislead the reader, because there is no performance gain or loss. If you invest a lot of resources in servers, maybe you can achieve a faster first page render, if all goes well, but if the user browses to a second page, SSR will be slower overall.
You also mentioned only the cons of prerendering. It's actually the better solution: less code to maintain, less server resource consumption than an SSR setup, and a faster response time because it's only a cache, so you end up with more performance.
Hopefully this clears things up:
I think there's some confusion here. I am not advocating only using SSR like a traditional monolithic app.
I am, however, advocating having a SPA experience that is also utilizing SSR. The initial render is from the server, and then the client side JavaScript takes over to provide a SPA experience. This means that web crawlers are able to view content on that initial render of any page.
This is probably my fault in not making that clear in the article, so I will probably add that to the "Universal JavaScript" section.
Rendora (github.com/rendora/rendora) does exactly what you need, and it's FOSS.
Thanks Sunny for providing these alternatives. Though I'm still uncomfortable with the fact that I have to use a specific technology on the backend in order to solve the SPA issues, which in my opinion makes using traditional approaches, with a solid and maintainable structure of course, more convenient.
Which means that Next.js and other solutions do these procedures, which is obviously extra work compared to traditional SSR solutions. What are the drawbacks? Does it affect the server response time?

Hi Mazen. My point about using Next.js/Nuxt.js is that if you're already using modern SPA frameworks like React and Vue, then Next.js/Nuxt.js allow you to keep using those frameworks, but with SSR features baked in. You have to use Node.js as a backend technology, but all it's doing is rendering your front-end application.
Having said that, you really just have to evaluate your use cases. If you're comfortable using a traditional monolithic framework like Ruby on Rails, Laravel, or Django, by all means go ahead. If you're also just building apps behind a login screen, like a mail client, then you'd be better off building a SPA without any SSR.
As far as drawbacks go, I would say SSR does introduce complexity such as having to run a Node.js server. Performance is pretty good when using a framework like Next.js, but that doesn't mean that you can't slow things down by fetching too much data server side or just having a slow server.
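For context, here is a hedged sketch of what those "baked in" SSR features look like in a Next.js page; the `/api/posts` endpoint is hypothetical:

```jsx
// pages/index.js -- Next.js renders this on the server for the first request,
// then the same component runs client-side for subsequent navigation
import React from 'react';
import fetch from 'isomorphic-unfetch';

function Home({ posts }) {
  return (
    <ul>
      {posts.map(post => (
        <li key={post.id}>{post.title}</li>
      ))}
    </ul>
  );
}

// Runs on the server for the initial render, and on the client afterwards
Home.getInitialProps = async () => {
  const res = await fetch('https://example.com/api/posts'); // hypothetical API
  const posts = await res.json();
  return { posts };
};

export default Home;
```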
MeteorJS: out-of-the-box SSR, GraphQL (Apollo), support for Prerender.io, and detailed monitoring of all the important things on the server (Node and code error reporting, time taken to run methods and publications, CPU and RAM usage, etc.). With Meteor you can do SSR and prerendering at the same time if you really need it and it makes you happy. For instance, send Facebook to SSR and Google to Prerender, or push to Prerender if you do SSR by default and your server hits 80% CPU.
This is from my Prerender log today: 2019-08-07T16:40:50.164Z got 200 in 8ms for xxxxxxxxx.com/. So if 8 ms is not fast enough for you, sure, try SSR :).
Now with Meteor you can deploy pretty much anywhere, including the Meteor hosting service, Galaxy. It has a pretty large community and broad adoption, and if you need more: Meteor is reactive by default, includes Cordova builds for both Android and iOS, supports code splitting, works with Blaze, React, Vue, Angular, and possibly others, and has all the necessary pieces to use it as a backend for React Native.
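For anyone curious, Meteor's out-of-the-box SSR boils down to the `server-render` package; a minimal sketch, assuming a React root component at a made-up import path:

```jsx
// server/main.js -- sketch using Meteor's server-render package
import React from 'react';
import { renderToString } from 'react-dom/server';
import { onPageLoad } from 'meteor/server-render';
import { App } from '/imports/ui/App'; // hypothetical root component

onPageLoad(sink => {
  // Inject the server-rendered markup into the element the client bundle will hydrate
  sink.renderIntoElementById('app', renderToString(<App />));
});
```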
I am not paid by Meteor. I just use it, love it and wanted to share my enthusiasm with you.
Some lessons learned while working on server-side rendered apps at scale:
It used to be that a lot of companies served two different versions of a webpage: one for bots (detected via the user agent of the request) and one for regular users.
Maintaining two different versions of a page became a maintenance nightmare for my team and me on one codebase we worked on.
We now just have one version of a page.
If you're doing your job right, creating valuable content for users, using the appropriate tags on your pages, adding Schema.org values, and using Google Search Console, you shouldn't need two versions of a page (see the structured-data sketch below).
I really need to write an in-depth article about this.
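To make the "one version of a page" point concrete, structured data can live right in the page that real users see. A hedged sketch of a Schema.org Article block rendered from a React component, with placeholder field values:

```jsx
// A small component that embeds Schema.org JSON-LD into the single, shared version of the page
import React from 'react';

function ArticleStructuredData({ title, author, datePublished }) {
  const jsonLd = {
    '@context': 'https://schema.org',
    '@type': 'Article',
    headline: title,
    author: { '@type': 'Person', name: author },
    datePublished,
  };

  return (
    <script
      type="application/ld+json"
      // React needs dangerouslySetInnerHTML to emit a raw JSON script tag
      dangerouslySetInnerHTML={{ __html: JSON.stringify(jsonLd) }}
    />
  );
}

// <ArticleStructuredData title="Server Side Rendering" author="Jane Doe" datePublished="2019-08-07" />
```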
I'm going to look up Nuxt! Thanks for holding us accountable! One of the things I did in my plain Node, Express, and Mongoose project was a router dedicated to rendering the EJS templates. I would hit that router with AJAX to have Node render and return the EJS templates to the frontend.
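That pattern might look roughly like this; the `partials/card-list` template, the `Card` model, and the route names are hypothetical:

```javascript
// server: a router whose only job is to render EJS templates and return the HTML
const express = require('express');
const Card = require('./models/card'); // hypothetical Mongoose model
const app = express();

app.set('view engine', 'ejs');

app.get('/render/card-list', async (req, res) => {
  const cards = await Card.find();
  res.render('partials/card-list', { cards }); // responds with rendered HTML, not JSON
});

app.listen(3000);

// client: fetch the rendered template and drop it into the page
fetch('/render/card-list')
  .then(response => response.text())
  .then(html => {
    document.querySelector('#card-list').innerHTML = html;
  });
```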
Happy to hear that Michael. I used to do something similar as well haha.
I am confused about your social media optimization point. If Facebook uses SSR (I read that somewhere), then why is a preview of Facebook posts not shown when we post their links on another site, e.g. Twitter? You've obviously noticed that.
The "social media optimization" meant that the social medias will fetch a preview of your website. But this won't happen out of sudden, you have to define some specific meta tags that tell the social media's crawler what to fetch. If you check the source code of a Facebook's post you will notice that there's no meta tags ( twitter cards ) that tell the Twitter's crawler what to fetch, so no wonder why twitter is blind when it comes to Facebook's posts.
Yes, exactly. Even in a SPA you can provide a title and meta tags. However, they can only be updated via client-side JavaScript, which Facebook is unable to parse. If I share a page of your SPA on Facebook, it will still show the title, description, and image for your homepage.
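As an illustration, in a server-rendered Next.js page those tags end up in the initial HTML, so crawlers see them without running any JavaScript. The URLs and copy below are placeholders:

```jsx
// pages/post.js -- social preview tags rendered server-side with next/head
import React from 'react';
import Head from 'next/head';

export default function Post({ title, description, imageUrl }) {
  return (
    <>
      <Head>
        <title>{title}</title>
        <meta name="description" content={description} />
        {/* Open Graph tags, read by Facebook's crawler */}
        <meta property="og:title" content={title} />
        <meta property="og:description" content={description} />
        <meta property="og:image" content={imageUrl} />
        {/* Twitter card tags */}
        <meta name="twitter:card" content="summary_large_image" />
        <meta name="twitter:title" content={title} />
      </Head>
      <article>{/* page content */}</article>
    </>
  );
}
```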
If you would like more control over your SSR routing, consider using Razzle.
Actually, prerendering is still the best option, as it's only for crawlers. The user experience with a SPA is many times faster. I use a Redis cache and only prerender a page once a week.
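A rough sketch of that setup, using `ioredis` here; the `renderPage` function and the one-week TTL just mirror the idea, not any particular implementation:

```javascript
// Serve crawlers from a Redis cache, re-prerendering at most once a week
const Redis = require('ioredis');
const redis = new Redis();

const ONE_WEEK_SECONDS = 7 * 24 * 60 * 60;

async function getPrerenderedHtml(url, renderPage) {
  const cacheKey = `prerender:${url}`;

  const cached = await redis.get(cacheKey);
  if (cached) return cached; // the crawler gets the cached snapshot

  const html = await renderPage(url); // hypothetical: headless-browser or SSR render
  await redis.set(cacheKey, html, 'EX', ONE_WEEK_SECONDS); // expire after a week
  return html;
}
```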
Why not just server-side render the first page and lazily load your JavaScript?
Food for thought: dev.to/cliffordfajardo/comment/8984
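For what it's worth, the "lazily load your JavaScript" part usually comes down to code splitting. A small sketch using Next.js's `next/dynamic`; the `HeavyChart` component is hypothetical:

```jsx
// pages/index.js -- server-render the page shell, lazily load a heavy widget on the client
import React from 'react';
import dynamic from 'next/dynamic';

// The chart bundle is split out and only fetched in the browser (ssr: false skips it on the server)
const HeavyChart = dynamic(() => import('../components/HeavyChart'), {
  ssr: false,
  loading: () => <p>Loading chart…</p>,
});

export default function Home() {
  return (
    <main>
      <h1>Rendered on the server</h1>
      <HeavyChart />
    </main>
  );
}
```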
I may have stirred up some confusion here, as I think you can have both an SPA and SSR experience in one:
I think there's some confusion here. I am not advocating only using SSR like a traditional monolithic app.
I am, however, advocating having a SPA experience that is also utilizing SSR. The initial render is from the server, and then the client side JavaScript takes over to provide a SPA experience. This means that web crawlers are able to view content on that initial render of any page.
This is probably my fault in not making that clear in the article, so I will probably add that to the "Universal JavaScript" section.
SPAs are more performant than SSR; the only difference is the landing page, and sometimes not even that.
The performance gain a user gets from a SPA grows linearly with the number of pages visited.