Since Next.js has become (in my opinion) the best option for creating serverless apps with React, I'm starting a new series of articles about how to handle SEO with this great framework.
This first one is a very simple recipe for adding sitemap.xml and robots.txt files. As almost everybody knows, these files are used by search engine crawlers such as Googlebot to understand the site structure and which pages should be listed.
To keep the post short, I'll only show static files here. You can still turn them into dynamic ones by fetching your data first and passing it to the getSitemap and getRobots functions (see the sketches after each example below).
pages/sitemap.xml.tsx
import React from 'react';
import { NextPageContext } from 'next';

const getSitemap = () => `<?xml version="1.0" encoding="utf-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2020-07-01</lastmod>
    <changefreq>daily</changefreq>
    <priority>0.9</priority>
  </url>
</urlset>`;

class Sitemap extends React.Component {
  public static async getInitialProps({ res }: NextPageContext) {
    // Write the XML directly to the response instead of rendering a page.
    if (res) {
      res.setHeader('Content-Type', 'text/xml');
      res.write(getSitemap());
      res.end();
    }
  }
}

export default Sitemap;
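If you want the dynamic version mentioned at the beginning, the idea is to fetch your data inside getInitialProps and pass it to getSitemap. Here is a minimal sketch of how the same pages/sitemap.xml.tsx could look; the /api/posts endpoint, the shape of its response, and the availability of fetch on the server (Node 18+ or a polyfill such as isomorphic-unfetch) are assumptions for illustration.

import React from 'react';
import { NextPageContext } from 'next';

// Build the <url> entries from whatever data was fetched beforehand.
const getSitemap = (urls: string[]) => `<?xml version="1.0" encoding="utf-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
${urls
  .map(
    (url) => `  <url>
    <loc>${url}</loc>
    <changefreq>daily</changefreq>
    <priority>0.9</priority>
  </url>`
  )
  .join('\n')}
</urlset>`;

class Sitemap extends React.Component {
  public static async getInitialProps({ res }: NextPageContext) {
    // Hypothetical endpoint returning [{ slug: string }, ...]; replace with your own data source.
    const posts: Array<{ slug: string }> = await fetch(
      'https://example.com/api/posts'
    ).then((r) => r.json());

    const urls = posts.map((post) => `https://example.com/posts/${post.slug}`);

    if (res) {
      res.setHeader('Content-Type', 'text/xml');
      res.write(getSitemap(urls));
      res.end();
    }
  }
}

export default Sitemap;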
pages/robots.txt.tsx
import React from 'react';
import { NextPageContext } from 'next';

const getRobots = () => `User-agent: *
Disallow: /_next/static/
`;

class Robots extends React.Component {
  public static async getInitialProps({ res }: NextPageContext) {
    // Same trick as the sitemap, but served as plain text.
    if (res) {
      res.setHeader('Content-Type', 'text/plain');
      res.write(getRobots());
      res.end();
    }
  }
}

export default Robots;
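The same idea applies to pages/robots.txt.tsx: compute or fetch whatever you need and pass it to getRobots. As a minimal sketch (the NODE_ENV check and the sitemap URL are assumptions for illustration), you could serve a restrictive file on non-production deployments:

import React from 'react';
import { NextPageContext } from 'next';

// Block everything outside production; in production, also advertise the sitemap.
const getRobots = (isProduction: boolean) =>
  isProduction
    ? `User-agent: *
Disallow: /_next/static/
Sitemap: https://example.com/sitemap.xml
`
    : `User-agent: *
Disallow: /
`;

class Robots extends React.Component {
  public static async getInitialProps({ res }: NextPageContext) {
    if (res) {
      res.setHeader('Content-Type', 'text/plain');
      res.write(getRobots(process.env.NODE_ENV === 'production'));
      res.end();
    }
  }
}

export default Robots;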
Top comments (3)
I guess just putting robots.txt in the public directory would work, wouldn't it?

I guess this works only with a dynamic page approach; there's no way to export it as a static page.

Not working