Let's talk SEO, 10 tips you should know

Yaser Adel Mehraban on June 11, 2019

For most companies, ranking #1 is like a blessing to their business. However, most web developers are not aware of what should be done to re...

Question: React sites? (or any other js-generated site)

Google "says" there is no problem, but empirically I think the opposite is true.

btw, I should add:

  • Google Webmaster (and Bing Webmaster)
  • Google Analytics.

Good question. Considering most of these tools work with sitemaps and static HTML files, you should definitely do something; it won't happen for you magically. SSR, or generating static pages linked to a sitemap, are some options there.
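As a rough sketch of the "static pages linked to a sitemap" idea: a build step can emit a sitemap.xml from the list of routes your static build produced. The `buildSitemap` helper and the URLs below are made up for illustration; the XML format itself is the standard sitemaps.org one.

```javascript
// Build a minimal sitemap.xml string from a list of absolute URLs.
// buildSitemap is a hypothetical helper, not from any library.
function buildSitemap(urls) {
  const entries = urls
    .map((url) => `  <url><loc>${url}</loc></url>`)
    .join("\n");
  return [
    '<?xml version="1.0" encoding="UTF-8"?>',
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">',
    entries,
    "</urlset>",
  ].join("\n");
}

// Example: routes a static build might have produced
const xml = buildSitemap([
  "https://example.com/",
  "https://example.com/blog/seo-tips",
]);
console.log(xml);
```

You would write the result to `public/sitemap.xml` (or wherever your host serves static files from) and reference it from robots.txt.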


I've done some experiments and monitored them using Google Search Console, and it seems my Vue.js-based app was crawled correctly by prerendering and redirecting bots to a static prerendered version of my website. The big advantage is that there is nothing to change for your end users; the disadvantage is that you have to do it :)
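A minimal sketch of the "redirect bots to a prerendered version" approach: detect crawler user agents and serve the static snapshot to them instead. The `isCrawler` helper and the bot list below are assumptions for illustration; real setups (e.g. prerender middleware) match many more bots.

```javascript
// Crude crawler detection by user-agent substring.
// The pattern list is illustrative, not exhaustive.
const BOT_PATTERNS = ["googlebot", "bingbot", "yandex", "duckduckbot"];

function isCrawler(userAgent) {
  const ua = (userAgent || "").toLowerCase();
  return BOT_PATTERNS.some((bot) => ua.includes(bot));
}

// In an Express-style handler you would branch on it, roughly:
//   if (isCrawler(req.headers["user-agent"])) res.sendFile(prerenderedPath);
console.log(isCrawler("Mozilla/5.0 (compatible; Googlebot/2.1)")); // true
console.log(isCrawler("Mozilla/5.0 (Windows NT 10.0) Chrome/90")); // false
```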


Google says that it can crawl a site exactly like any other browser, but that is not true, and we have empirical results showing the opposite. Google indexes a site with JS, but it doesn't work to the fullest.

Google can indeed crawl a JavaScript-generated site, but Google has said (officially) that it takes more work. youtube.com/watch?v=PFwUbgvpdaQ

Also, Google can misunderstand a site, such as one it thinks is cheating, and Google doesn't tell you about it.

Also, Google recommends using schema/noscript for lazy-loading images, and it hurts SEO. But Google also says that lazy loading is OK as-is, so it is misleading.
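For context, the schema/noscript pattern mentioned here usually looks roughly like this (a sketch; the image path and class name are placeholders, and the `data-src` swap would be done by whatever lazy-load script you use):

```html
<!-- JS-driven lazy loading, with a plain <img> fallback for crawlers
     and users without JavaScript -->
<img data-src="/images/hero.jpg" alt="Hero image" class="lazyload">
<noscript>
  <img src="/images/hero.jpg" alt="Hero image">
</noscript>
```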

So Google is full of half-lies, especially since they keep shuffling the algorithms without telling us about it.

I believe that because Google's algorithms are closed, in terms of SEO we can only rely on our own or others' experiments, or on the statements of Google's staff.


There is no problem with well-rendered HTML; for the most part, it doesn't matter how it is produced. Goo has never read JS well.

If the content is not on the page and is obscured by JS (pulled in later, not in the HTML, etc.), then it is not likely to be visible to bots.

Do use GW. Use something else instead of GA (privacy violations, currently being investigated, possible antitrust).


GA and GW ... can you provide some links about them? I don't know what those are, never heard of them.

Sorry, Goo Analytics & Webmasters. Shorthand.


This is a good list Yaser, always detailed & clearly laid out for people to understand.

Most tips are on-page SEO specifically. Some of these are not really SEO, but useful for traffic generation or making a nice site (Twitter cards are not SEO).

Let me add a couple of things.

  1. Quality content (this was always Goo & MC's #1 point).
  2. Page count/page indexing rate (number of pages indexed is highly correlated to traffic levels).
  3. Inbound links/ anchor text (outbound links do nothing for SEO on their own).
  4. Performance.
  5. Text must appear on the page (they have been back and forth a few times on this since the "miserable failure" Google bomb).
  6. Content distinction (if most pages are basically too similar).

Matt Mullenweg (WP) stated, a long time ago in a video interview, that one of the best SEO tips for wordpress.com was removing the meta desc. A 30% bump or something. Results vary.


Matt Mullenweg (WP) stated, a long time ago in a video interview, that one of the best SEO tips for wordpress.com was removing the meta desc. A 30% bump or something.

That's pretty interesting, any chance you can provide a link (or some keywords I can search) to find that reference?


It was a long time ago (many years), I think it was about wp (or wordcamp) & the interviewer asks for one unusual SEO/traffic tip or something.

It came down to "let Goo choose some desc keywords from your content", pretty much. The meta desc doesn't do much either way for 99% of sites. The snippet perhaps, a tiny difference.

Looking at wp (org/com) they use them right now. I do too. It's testable.

Not intentionally being vague. If I come across it, I'll drop it here.


Awesome list! Was wondering if anyone has come across a good CMS that handles a lot of this for you? Currently I'm thinking about migrating our blog to a CMS and want to make it as easy as possible for editors to write content while automating the SEO pieces of the content.


I am using Gatsby for my own blog and most of these are in place by default (from the starter I chose), but I'm not sure what you mean by a CMS for a blog.


Thinking more of a company blog that would have multiple authors, where each author may not have the SEO knowledge to write the proper metadata.

That can be achieved using anything. I have some guest posts on my blog where the author name is not mine, but that's just in the frontmatter of the Markdown file. Simple as that.
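For example, a guest post's frontmatter in a Markdown file might look like this (field names vary by theme/starter; these are illustrative):

```yaml
---
title: "10 SEO tips you should know"
author: "Guest Author Name"   # overrides the site's default author
date: 2019-06-11
description: "A short meta description for this post"
---
```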


I generally use Wordpress as it has been very battle tested SEO wise.

The majority of the SEO industry uses it, and many of the big info-sites around now started with it. It has a lot of features (including plugins/self-coded ones) that help publishing a lot, and a bunch of things you can do to get extra results. You can do this with any code/CMS, but it's been done millions of times already with WP. It handles 100K+ uniques a day no problem, and not many people go above that.


Hi, I'm just about to finish reading your post, but I have a question. I have clients coming to me who expect me to know SEO, so I'm reading and going through moz.com/beginners-guide-to-seo.

Please recommend some resources that will help me get good at it from the start. I'm a junior developer with experience in Node, Express, React, Redux, and server-side rendering with Next.js (which also mentions the SEO benefits, which I completely don't get, so please share some insight on starting from scratch and getting good/pro; I know it's subjective, but why not).


What about JSON-LD? It doesn't work like Open Graph; in fact, it does not help you when you share content through Fb, but I think it does help you with SEO. Why is it not in the list? Is it because you wanted to keep it simple with just HTML?
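For readers wondering what JSON-LD looks like in practice, here is a sketch of building a payload for a blog post. `serializeJsonLd` is a made-up helper name; `BlogPosting`, `headline`, `author`, and `datePublished` are real schema.org vocabulary, and the values are placeholders taken from this article.

```javascript
// Build the JSON payload for a <script type="application/ld+json"> tag.
// serializeJsonLd is a hypothetical helper, not a library function.
function serializeJsonLd(post) {
  return JSON.stringify({
    "@context": "https://schema.org",
    "@type": "BlogPosting",
    headline: post.title,
    author: { "@type": "Person", name: post.author },
    datePublished: post.date,
  });
}

const jsonLd = serializeJsonLd({
  title: "Let's talk SEO, 10 tips you should know",
  author: "Yaser Adel Mehraban",
  date: "2019-06-11",
});
// Embed the result in the page head as:
//   <script type="application/ld+json">...</script>
console.log(jsonLd);
```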

Also, what's the difference between robots.xml file and that robots tag that you mention? Yeah, newbie question xD


Yes this was simply using normal HTML tags which is easy to follow, for people who are developing everyday and are not aware of these.

I am not sure what you mean by robots.xml; I know of robots.txt, and the difference is that you can control more than one page in the txt file, while the tag only applies to the current page 😊
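To make the distinction concrete: a robots.txt rule covers whole URL patterns for the entire site, while the meta tag sits in one page's `<head>`. The paths below are illustrative:

```
# robots.txt — site-wide rules for crawlers
User-agent: *
Disallow: /admin/
Sitemap: https://example.com/sitemap.xml
```

versus the per-page equivalent:

```html
<!-- applies only to the page containing it -->
<meta name="robots" content="noindex, nofollow">
```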


Great article! I didn’t know about the canonical links. Very cool. Thank you.
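For reference, a canonical link is a single tag in the page's `<head>` pointing at the preferred URL for that content (the URL here is a placeholder):

```html
<link rel="canonical" href="https://example.com/blog/seo-tips">
```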


I've built my blog to 60k+ page visits per month, and I didn't focus on any of these tips.

The single most important factor in SEO is to write content that people are searching for.

If you nail that, nothing else really matters all that much.


Yaser, thanks for the great tips! I used them on my website and got positive results! You are a great man!

Mushahid Hussain
Website: mushahidhussain.info/
