re: Let's talk SEO, 10 tips you should know


Question: what about React sites? (or any other JS-generated site)

Google "says" there is not a problem but empirically I think it is the opposite.

btw, I should add:

  • Google Webmaster (and Bing Webmaster)
  • Google Analytics.
 

Good question. Considering most of these tools work with sitemaps and static HTML files, you should definitely do something; it won't happen magically for you. SSR, or generating static pages linked to sitemaps, are some options there.
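For the "static pages linked to sitemaps" part, here's a minimal sketch of generating a sitemap.xml at build time. The domain and route list are placeholders for your own app, not anything from this thread:

```js
// Minimal sketch: write a sitemap.xml from a hard-coded route list (Node).
const fs = require('fs');

const domain = 'https://example.com';                // assumption: your site's origin
const routes = ['/', '/about', '/posts/seo-tips'];   // assumption: your app's routes

const urls = routes
  .map((path) => `  <url><loc>${domain}${path}</loc></url>`)
  .join('\n');

const sitemap =
  '<?xml version="1.0" encoding="UTF-8"?>\n' +
  `<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n${urls}\n</urlset>\n`;

fs.writeFileSync('sitemap.xml', sitemap);
console.log(`Wrote sitemap with ${routes.length} URLs`);
```

Run it as part of your build and link the resulting file from Search Console; a static site generator or SSR framework would usually do this step for you.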

 

There is no problem with well-rendered HTML; for the most part it doesn't matter how it is produced. Google has never read JS well.

If the content is not on the page and is obscured by JS (pulled in later, not in the HTML, etc.), then it is not likely to be visible to bots.
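A quick way to see what a non-rendering bot sees is to fetch the raw HTML and check whether your content is actually in it. Rough sketch (assumes Node 18+ for the global fetch; the URL and phrase are placeholders):

```js
// Fetch the raw HTML (no JS execution) and look for a phrase that should be in the content.
const url = 'https://example.com/some-page';   // assumption: a page on your site
const phrase = 'Let\'s talk SEO';              // assumption: text your JS renders

fetch(url)
  .then((res) => res.text())
  .then((html) => {
    // If the phrase is only injected by client-side JS, it won't appear here.
    const visible = html.includes(phrase);
    console.log(visible
      ? 'Phrase is in the raw HTML - visible without JS'
      : 'Phrase is NOT in the raw HTML - only a JS-rendering crawler will see it');
  })
  .catch((err) => console.error('Fetch failed:', err));
```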

Do use GW. Use something else instead of GA (privacy violations, currently being investigated, possible antitrust).

 

GA and GW ... can you provide some links about them? I don't know what those are, never heard of them.

Sorry, Google Analytics & Webmasters. Shorthand.

 
 

Google says that it can crawl a site exactly like any other browser, but that is not true; we have empirical results showing the opposite. Google indexes a site with JS, but it doesn't work at its fullest.

Google can indeed crawl a JavaScript-generated site, but Google has said (officially) that it takes more work: youtube.com/watch?v=PFwUbgvpdaQ

Also, Google can misunderstand a site, for example mistaking it for one that is cheating, and Google won't tell you about it.

Also, Google recommends using schema/noscript for lazy-loaded images, and that hurts SEO. But Google also says that lazy loading is fine as is, which is misleading.
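For reference, the noscript approach being discussed looks roughly like this. It's only a sketch; the markup and class names are placeholders, not something Google or the original post specifies:

```js
// Lazy loading with IntersectionObserver, plus a <noscript> fallback in the markup.
// Assumed (placeholder) markup:
//   <img class="lazy" data-src="/img/photo.jpg" alt="...">
//   <noscript><img src="/img/photo.jpg" alt="..."></noscript>
const observer = new IntersectionObserver((entries, obs) => {
  entries.forEach((entry) => {
    if (!entry.isIntersecting) return;
    const img = entry.target;
    img.src = img.dataset.src;   // swap in the real image once it scrolls into view
    obs.unobserve(img);
  });
});

document.querySelectorAll('img.lazy').forEach((img) => observer.observe(img));
```

The <noscript> copy is what a non-JS crawler falls back to; the complaint above is that Google's advice on whether you need it at all is inconsistent.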

So Google is full of half-truths, especially since they keep shuffling their algorithms without telling us about it.

I believe that, because Google's algorithms are closed, in terms of SEO we can rely either on our own/others' experiments or on statements from Google's staff.

 

I've done some experiments and monitored them using Google Search Console, and it seems my VueJS-based app was crawled correctly by prerendering and redirecting bots to a static prerendered version of my website. The big advantage is that there's nothing to change for your end users; the disadvantage is that you have to set it up :)
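The bot-detection part of that setup can be as simple as a user-agent check in front of the app. A rough Express sketch, assuming the prerendered snapshots live in a prerendered/ folder (the folder, port, and user-agent list are my assumptions, not the commenter's actual setup):

```js
// Serve prerendered HTML to known crawlers, the normal client-side app to everyone else.
const express = require('express');
const path = require('path');

const app = express();
const BOT_UA = /googlebot|bingbot|yandex|duckduckbot|baiduspider/i;

app.use((req, res, next) => {
  if (BOT_UA.test(req.get('user-agent') || '')) {
    // Map the route to its static prerendered snapshot, e.g. "/about" -> "about.html".
    const file = req.path === '/' ? 'index.html' : `${req.path.slice(1)}.html`;
    return res.sendFile(path.join(__dirname, 'prerendered', file));
  }
  next(); // regular users get the normal client-side-rendered app
});

app.use(express.static(path.join(__dirname, 'dist')));
app.listen(3000, () => console.log('Listening on :3000'));
```

Hosted prerendering services work the same way conceptually; they just do the snapshotting and the user-agent matching for you.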
