SEO goes beyond Svelte: it is a concern no matter what fancy JS tool you use. You should never forget that.
Even if Svelte's ecosystem is still young, you'll find many resources to boost your project and save time. But how do you handle SEO? Before we can answer that question, we need to explore essential concepts.
We won't see how to super-boost your ranking. The idea is to avoid harming indexation and to understand what's at stake.
The rendering engine parses your documents and displays the parsed content on the screen.
There are various rendering engines. WebKit (Safari) and Blink (Chrome) are probably the most widespread.
Before anything appears on the screen, the browser goes through many steps (layers) and calculations. That is why you need to optimize many things, including stylesheets and scripts, to speed up overall rendering and prevent render-blocking issues.
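As a quick illustration (the file names are hypothetical), a common way to reduce render-blocking is to defer script execution so HTML parsing isn't interrupted:

```html
<!-- `defer` downloads the script in parallel and runs it only after
     the document has been parsed, so it never blocks rendering.
     File names are made up for the example. -->
<script src="/bundle.js" defer></script>
```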
You know, simplicity is a feature. If something is too complicated, it's rarely the right way. Keeping things stupid and simple requires expertise and accuracy.
With client-side rendered (CSR) JavaScript websites, the rendering can be excellent. The difference is that the browser, not a server, renders your project. I know it's way more subtle than that, but let's keep it basic.
As a result, you get more interactivity (and reactivity ^^) and tons of great features.
Googlebot defers JS rendering because such rendering has a high cost. It needs resources to compute things, but resources are not infinite, so it uses a queue mechanism. Likewise, the queue is not infinite, so Googlebot cannot render everything every day.
The bot might index your content after several days or even weeks. Google calls those steps the initial and second waves of indexing.
Roughly speaking, you need a server this time, but all content is directly indexable with SSR.
Many frameworks, such as Next.js, follow that process. The caveat is that every single request makes the server do the work all over again.
Besides, it often has a high infrastructure cost.
It's a workaround for Googlebot and other crawlers.
The server reads the CSR contents and sends a simplified version to search engines and crawlers while humans still get the CSR part. There are some caveats too.
It requires a lot of resources, and you have to detect crawlers precisely. There are some tools for that, but it's not easy to set up and maintain correctly.
As a result, it acts like SSR, and you can still leverage the benefits of CSR.
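Detecting crawlers usually boils down to inspecting the `User-Agent` header. A sketch (the pattern list is illustrative, not exhaustive, and the two handlers are hypothetical stubs):

```javascript
// Known-crawler patterns; real setups maintain a much longer list.
const CRAWLER_PATTERN = /googlebot|bingbot|yandex|duckduckbot|baiduspider|twitterbot|facebookexternalhit/i;

function isCrawler(userAgent = '') {
  return CRAWLER_PATTERN.test(userAgent);
}

// Hypothetical stub handlers, for illustration only:
function servePrerendered(req, res) { res.end('prerendered HTML'); }
function serveCsrApp(req, res) { res.end('CSR shell + JS bundle'); }

// Express-style middleware sketch: bots get the prerendered page,
// humans get the regular CSR app.
function dynamicRendering(req, res) {
  if (isCrawler(req.headers['user-agent'])) {
    return servePrerendered(req, res);
  }
  return serveCsrApp(req, res);
}
```

Keeping that pattern list accurate over time is the hard part, which is why dedicated tools and services exist for this.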
If you look at the basic source code provided by the Svelte template, you might be afraid:
There are frameworks built upon Svelte that bring kick-ass features, including SEO (e.g., Sapper). I won't talk about them specifically, but please have a look at them. You could save a lot of time.
If you prefer handling that yourself, you can start with the head section using Svelte head. This element lets you add elements to the document's head, so in your .svelte file you can do the following:

```svelte
<svelte:head>
  <!-- your meta here -->
</svelte:head>
```
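A slightly fuller sketch (titles, descriptions, and URLs are made up for the example) could look like:

```svelte
<svelte:head>
  <title>My awesome page</title>
  <meta name="description" content="A short, unique description of this page." />
  <!-- Open Graph tags for social previews (values are hypothetical): -->
  <meta property="og:title" content="My awesome page" />
  <meta property="og:url" content="https://myawesome-svelte-website.com/page" />
</svelte:head>
```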
Once you have a robust <head>, it's relatively easy to add routing, for example with the svelte-routing package or with any framework powered by Svelte.
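As a sketch with the svelte-routing package (`Router`, `Route`, and `Link` are its real exports; the page components are hypothetical), an App.svelte could look like:

```svelte
<script>
  import { Router, Route, Link } from "svelte-routing";
  // Hypothetical page components:
  import Home from "./routes/Home.svelte";
  import About from "./routes/About.svelte";
</script>

<Router>
  <nav>
    <Link to="/">Home</Link>
    <Link to="/about">About</Link>
  </nav>
  <Route path="/about" component={About} />
  <Route path="/" component={Home} />
</Router>
```

Each routed component can then declare its own `<svelte:head>` metadata, so every page gets a distinct title and description.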
I strongly recommend using prerendering techniques, especially if you have a lot of content and pages.
Here is what Netlify says about prerendering:
If you don't know how to start prerendering, some hosts are quite useful for that. For example, Netlify has a beta feature called "prerendering" in the post-processing options. Please enable it and enjoy \o/.
There are efficient external services for that too, such as prerender.io.
To test that everything works fine, you can do simple things like:

```shell
curl -A Googlebot https://myawesome-svelte-website.com
```
It will give you what Googlebot gets, but be aware that Google serves cached versions of your pages. It's also a good idea to monitor things with the Google Search Console.
Svelte is impressive: high-traffic websites such as Spotify and The New York Times use Svelte in production. However, be extra careful with the SEO part when migrating from any other tool or starting a new fantastic project.