It has similar ideas and I generally agree, except one bit
Google and other search engines use a ‘real’ browser to index your pages.
Not quite correct. Googlebot indexing has two phases, and the first phase is done without a browser, so prerendered pages get indexed faster. Google uses roughly Chrome 41 for rendering, which means your SPA should work in Chrome 41. A lot of SPAs don't ship polyfills for it and basically fail to render, so Googlebot sees a JS error instead of content. That's why prerendered content (SSR or prerendering) is always a safe bet.
Googlebot uses a web rendering service (WRS) that is based on Chrome 41 (M41). Generally, WRS supports the same web platform features and capabilities that the Chrome version it uses — for a full list refer to chromestatus.com, or use the compare function on caniuse.com.
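To illustrate the failure mode: Chrome 41 predates features like `Object.entries` and `fetch`, so a single unpolyfilled call can throw before the app ever mounts, and Googlebot indexes an empty page. A minimal sketch of a feature-detected fallback (the function name is illustrative; in practice you'd ship core-js or a service like polyfill.io rather than hand-rolling this):

```javascript
// Chrome 41 (the version Googlebot's WRS is based on) predates
// Object.entries. If the SPA calls it without a polyfill, rendering
// dies with a TypeError and the crawler sees no content.
// A minimal hand-rolled fallback:
function entriesFallback(obj) {
  var result = [];
  for (var key in obj) {
    if (Object.prototype.hasOwnProperty.call(obj, key)) {
      result.push([key, obj[key]]);
    }
  }
  return result;
}

// Install it only when the native implementation is missing,
// so modern browsers keep using the built-in.
if (typeof Object.entries !== 'function') {
  Object.entries = entriesFallback;
}
```

The same pattern applies to `Object.assign`, `fetch`, and anything else newer than the crawler's browser; checking your feature usage against caniuse.com for Chrome 41 is the quick sanity test.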
The last time I checked, Google was using Chrome 54. I have a few websites without SSR, and there haven't been any problems with indexing.
But yes, if your site has thousands of pages, SSR can improve indexing speed.
source: developers.google.com/search/docs/...
Oops! Sorry, somewhere I read it's 54. My mistake!
I'll correct it