mhmd-elsaid


Create dynamic sitemap in Next.js app


SEO, or Search Engine Optimization, means setting up your website and content to show up in online search results. While many marketing tactics rely on you reaching out to your audience, SEO gives you the power to reach people when they are actively searching for information related to your products and services.

Steps to Create a Dynamic Sitemap File

  • First, install next-sitemap.
  • Create a next-sitemap.config.js file at the same level as next.config.js.
  • Define your base URL by adding the following code to next-sitemap.config.js so the package knows your website's URLs:
const siteUrl = "https://your-host"

module.exports = {
    siteUrl
}
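For reference, the install step above and the wiring that makes next-sitemap run on every build typically look like this (assuming npm; use the yarn/pnpm equivalents as needed):

```shell
# install next-sitemap
npm install next-sitemap

# next-sitemap is a CLI that runs after the build; a common setup is a
# "postbuild" script in package.json:  "postbuild": "next-sitemap"
npm run build
```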

Now build the app and you will find that a sitemap.xml file is automatically created with your static routes.

To auto-generate a robots.txt file, we just need to add generateRobotsTxt: true to the exported object in
next-sitemap.config.js. After updating, the file will look like the following:

const siteUrl = "https://your-host"

module.exports = {
    siteUrl,
    generateRobotsTxt: true
}

To make sure they are regenerated each time we build the app, add the following two lines to the .gitignore file:

/public/sitemap.xml
/public/robots.txt

What if we need to exclude some pages from the sitemap?
We can do that by updating next-sitemap.config.js and adding an exclude array to the exported object:

const siteUrl = "https://your-host"

module.exports = {
    siteUrl,
    generateRobotsTxt: true,
    exclude: ["/<page-url>"]
}

And to disallow those pages in the robots.txt file, we can use the robotsTxtOptions object.

After adding the pages we need to disallow, next-sitemap.config.js will look like the following:

const siteUrl = "https://your-host";

module.exports = {
  siteUrl,
  generateRobotsTxt: true,
  robotsTxtOptions: {
    policies: [
      { userAgent: "*", disallow: "/<page-url>" },
      { userAgent: "*", allow: "/" },
    ],
  },
  exclude: ["/<page-url>"],
};

In the steps above we created a dynamic sitemap for the static pages we have [ex: example.com, example.com/about, … etc].

This is great, but what about dynamically rendered pages? [like: example.com/posts/[post_id], … etc]

The problem is that we don't know the params in those URLs at build time.

So what should we do to overcome this problem?

  • First, we have to get the URLs of all dynamic pages.
  • Then we can create another sitemap for those pages.

let’s do it 🥷

Create a folder called server-sitemap.xml inside the pages folder and create an index.js file inside it.

In the index.js file we will use getServerSideProps to fetch the URLs of the dynamic pages:

import Axios from "axios";

export async function getServerSideProps(context) {
  const response = await Axios.get(`base_url/<data>`);
  const data = response?.data;
  return { props: { data } };
}

// The page component renders nothing; the route exists only to serve the sitemap.
export default function Site() {}


Then we will build a fields array of objects, each containing { loc: <page url>, lastmod: <time now> }:

import Axios from "axios";

export async function getServerSideProps(context) {
  const response = await Axios.get(`base_url/<data>`);
  const data = response?.data;
  // One sitemap entry per fetched item.
  const fields = data?.map((item) => ({
    loc: `https://your-host/${item?.url}`,
    lastmod: new Date().toISOString(),
  }));
  return { props: { data } };
}

export default function Site() {}
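As a standalone sketch, the mapping step boils down to a small pure function. Here buildFields is a hypothetical helper (not part of next-sitemap) that turns fetched records into sitemap field objects:

```javascript
// Hypothetical helper: turn fetched records into next-sitemap field
// objects of the shape { loc, lastmod }.
function buildFields(items, host, lastmod = new Date().toISOString()) {
  // Fall back to an empty array so a failed fetch yields an empty sitemap.
  return (items ?? []).map((item) => ({
    loc: `${host}/${item?.url}`,
    lastmod,
  }));
}
```

Extracting the mapping like this also makes it easy to unit-test without hitting the API.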

Then we will import the getServerSideSitemap function from next-sitemap, use it to generate the sitemap entries, and return its result from getServerSideProps:

import Axios from "axios";
import { getServerSideSitemap } from "next-sitemap";

export async function getServerSideProps(context) {
  const response = await Axios.get(`base_url/<data>`);
  const data = response?.data;
  const fields = data?.map((item) => ({
    loc: `https://your-host/${item?.url}`,
    lastmod: new Date().toISOString(),
  }));
  // getServerSideSitemap builds the XML response from the fields array.
  return getServerSideSitemap(context, fields);
}

export default function Site() {}

Great, now we have created the server-side dynamic sitemap. If we go to the URL host/server-sitemap.xml, we will find that our sitemap includes the fetched URLs as expected.
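For reference, the response served at that route is a standard sitemap urlset document; with hypothetical post URLs it looks roughly like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://your-host/posts/1</loc>
    <lastmod>2023-01-01T00:00:00.000Z</lastmod>
  </url>
  <url>
    <loc>https://your-host/posts/2</loc>
    <lastmod>2023-01-01T00:00:00.000Z</lastmod>
  </url>
</urlset>
```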

The last thing to do here is to inform Google's crawler that there is not just one sitemap, but two.
We do this by adding an additionalSitemaps key to the robotsTxtOptions object in next-sitemap.config.js, listing our two sitemaps [the static one we created before for static routes & the server-side one we created for URLs with params].

const siteUrl = "https://your-host";

module.exports = {
  siteUrl,
  generateRobotsTxt: true,
  robotsTxtOptions: {
    policies: [
      { userAgent: "*", disallow: "/page-url" },
      { userAgent: "*", allow: "/" },
    ],
    additionalSitemaps: [
      `${siteUrl}/sitemap.xml`,
      `${siteUrl}/server-sitemap.xml`,
    ],
  },
  exclude: ["/page-url"],
};
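With this config, the generated public/robots.txt should look roughly like the following (exact comment headers may vary by next-sitemap version):

```
# *
User-agent: *
Disallow: /page-url

User-agent: *
Allow: /

# Sitemaps
Sitemap: https://your-host/sitemap.xml
Sitemap: https://your-host/server-sitemap.xml
```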

That's all! We have now created a dynamic sitemap for our static pages and another for our dynamically rendered pages.
