Create a Dynamic Sitemap with Next.js

To improve your Search Engine Optimization (SEO), you might need to add a sitemap or robots.txt file to your Next.js site.

A sitemap defines the relationship between pages of your site. Search engines utilize this file to more accurately index your site. You can also provide additional information such as last updated time, how frequently the page changes, and more.

A robots.txt file tells search engines which pages or files the crawler can or can't request from your site.

Static Sitemaps

If your site does not update frequently, you might currently have a static sitemap. This is a basic .xml file listing the pages of your site.
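As a rough sketch, a static sitemap saved at public/sitemap.xml might look something like this (the example.com domain, routes, and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com</loc>
    <lastmod>2022-01-01</lastmod>
  </url>
  <url>
    <loc>https://example.com/about</loc>
    <lastmod>2022-01-01</lastmod>
  </url>
</urlset>
```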

As your site scales, you will probably want to create your sitemap dynamically.

Dynamic Sitemaps

If your site frequently changes, you should dynamically create a sitemap. Let's first look at an example where your site content is file-based (e.g., contained inside the /pages directory).

First, let's add globby so we can fetch a list of routes.
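Assuming you use npm, installing it as a dev dependency looks like this (yarn or pnpm work the same way):

```bash
npm install --save-dev globby
```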

Note: globby might not work on Windows.

Next, we can create a Node script at scripts/generate-sitemap.mjs. This file will dynamically build a sitemap based on your /pages directory.
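Here is a minimal sketch of such a script, assuming a recent ESM version of globby, .js page files, and a placeholder example.com domain (swap in your own domain and extensions):

```js
// scripts/generate-sitemap.mjs
import { writeFileSync } from 'fs';
import { globby } from 'globby';

async function generate() {
  // Collect page files, excluding Next.js internals and API routes
  const pages = await globby([
    'pages/**/*.js',
    '!pages/_*.js',
    '!pages/api',
  ]);

  const urls = pages
    // Skip dynamic routes like pages/posts/[id].js; those need their own handling
    .filter((page) => !page.includes('['))
    .map((page) => {
      // pages/blog/hello.js -> /blog/hello, pages/index.js -> ''
      const route = page
        .replace('pages', '')
        .replace('.js', '')
        .replace('/index', '');
      return `  <url>\n    <loc>https://example.com${route}</loc>\n  </url>`;
    })
    .join('\n');

  const sitemap = `<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
${urls}
</urlset>
`;

  writeFileSync('public/sitemap.xml', sitemap);
}

generate();
```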

Finally, add a postbuild script in your package.json to run this script after next build completes. The generated file is created at public/sitemap.xml, which is then accessible from the root of your site.
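The relevant part of package.json might look like this (keep whatever build script you already have; npm runs postbuild automatically after build):

```json
{
  "scripts": {
    "build": "next build",
    "postbuild": "node scripts/generate-sitemap.mjs"
  }
}
```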

External Content

If you have externally hosted data (e.g., a CMS), you'll need to make an API request before you can create your sitemap. This implementation will vary depending on your data source, but the idea is the same. To demonstrate, I've created an example using placeholder data.

First, create a new file at pages/sitemap.xml.js.

When the route /sitemap.xml is initially loaded, we will fetch posts from an external data source and then write an XML file as the response.
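Here is a sketch of that page, assuming the JSONPlaceholder API as the placeholder data source and example.com as the domain; it uses getServerSideProps to write the XML straight to the response:

```js
// pages/sitemap.xml.js
const EXTERNAL_DATA_URL = 'https://jsonplaceholder.typicode.com/posts';

function generateSiteMap(posts) {
  return `<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com</loc>
  </url>
  ${posts
    .map(({ id }) => {
      return `<url>
    <loc>https://example.com/posts/${id}</loc>
  </url>`;
    })
    .join('')}
</urlset>`;
}

// The component renders nothing; getServerSideProps sends the XML itself.
function SiteMap() {}

export async function getServerSideProps({ res }) {
  // Fetch posts from the external data source (placeholder API here)
  const request = await fetch(EXTERNAL_DATA_URL);
  const posts = await request.json();

  // Generate the XML and send it as the response
  res.setHeader('Content-Type', 'text/xml');
  res.write(generateSiteMap(posts));
  res.end();

  return { props: {} };
}

export default SiteMap;
```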

robots.txt

Finally, we can create a static file at public/robots.txt to define which files can be crawled and where the sitemap is located.
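For example (again with a placeholder domain):

```text
User-agent: *
Allow: /
Sitemap: https://example.com/sitemap.xml
```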

Read the complete article here: https://www.epicprogrammer.com/2022/01/create-dynamic-sitemap-with-nextjs.html
