
Bojan Stanojevic

Next.js robots.txt location

One of the main reasons developers use popular JavaScript frameworks is the simplicity and speed of getting an SEO-optimized website up and running. But then, when you need to configure very basic things like a sitemap, robots.txt, or any script, it turns out you need to investigate a bit to figure out how to do it. I've noticed people asking about the robots.txt location for Next.js, so I thought I'd write a short post about it.

At least setting up this file is very straightforward. If you're using Next.js version 13 or later with the App Router, all you have to do is go to the app folder (the root of your application routes) and create a new file inside it called robots.txt
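Assuming a typical App Router project (every file name below other than robots.txt is just a placeholder), it ends up here:

```
my-app/
├── app/
│   ├── layout.tsx
│   ├── page.tsx
│   └── robots.txt   ← served at https://YOUR_DOMAIN_NAME.com/robots.txt
└── package.json
```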

The most basic robots.txt looks like this:

```
User-agent: *
Allow: /
Sitemap: https://YOUR_DOMAIN_NAME.com/YOUR_SITEMAP.xml
```
  • User-agent: This directive specifies the web crawler to which the rule applies.
  • *: The asterisk * is a wildcard that represents "all" user agents. In other words, any web crawler or bot, regardless of its name or origin, should follow the rules defined after this line.
  • Allow: /: This explicitly allows web crawlers to access the entire website, starting from the root directory.
  • Sitemap: This directive provides the location of your website's XML sitemap to search engine crawlers.

Inside the robots.txt file you can also mark certain routes that should not be crawled, using the Disallow directive, for example:

```
Disallow: /admin/
Disallow: /private/
```
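Note that Disallow rules belong under a User-agent group, so a complete file combining everything above (the /admin/ and /private/ paths are just examples) would look like this:

```
User-agent: *
Allow: /
Disallow: /admin/
Disallow: /private/
Sitemap: https://YOUR_DOMAIN_NAME.com/YOUR_SITEMAP.xml
```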

It's not really clear to me why the Next.js team didn't include at least a basic robots.txt file by default. The Next.js docs also explain how to generate a dynamic robots.js/robots.ts file, but I think a static robots.txt is sufficient for most of the websites out there.
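Still, for reference, here is a minimal sketch of the dynamic version following the app/robots.ts convention from the Next.js docs (the domain and the disallowed paths are placeholders):

```ts
import type { MetadataRoute } from 'next'

// app/robots.ts — Next.js serves the returned object at /robots.txt
export default function robots(): MetadataRoute.Robots {
  return {
    rules: {
      userAgent: '*',
      allow: '/',
      disallow: ['/admin/', '/private/'],
    },
    sitemap: 'https://YOUR_DOMAIN_NAME.com/sitemap.xml',
  }
}
```

The dynamic version can be useful if, for example, you want different rules per environment, such as disallowing crawling on preview deployments.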

Since we are talking about robots.txt, I thought I'd mention sitemaps as well. Like robots.txt, sitemap.xml is also located at the root of your project inside the app folder, and it is important for SEO and your website rankings. To generate a static sitemap, deploy your website and use one of the many free services online. I can recommend the Screaming Frog application, which includes the sitemap generator tool I use most often. From my experience this is the best tool for getting a sitemap for smaller websites, especially when you are working with a JavaScript project; for larger projects, a better idea is to generate the sitemap in code (there's a sketch of that below).

When you get your XML file, just upload it to the app folder and deploy your app. Finally, once you have your sitemap URL, submit it in the Sitemaps section of Google Search Console to get your website indexed quicker.
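If you do generate the sitemap in code, Next.js has an equivalent app/sitemap.ts convention. A minimal sketch, assuming two placeholder URLs:

```ts
import type { MetadataRoute } from 'next'

// app/sitemap.ts — Next.js serves the returned array at /sitemap.xml
export default function sitemap(): MetadataRoute.Sitemap {
  return [
    {
      url: 'https://YOUR_DOMAIN_NAME.com',
      lastModified: new Date(),
      changeFrequency: 'monthly',
      priority: 1,
    },
    {
      url: 'https://YOUR_DOMAIN_NAME.com/about',
      lastModified: new Date(),
      changeFrequency: 'yearly',
      priority: 0.8,
    },
  ]
}
```

In a real project you would typically build this array from your CMS or database instead of hardcoding the entries.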

If you need any help with your Next.js or SEO project, we can work together. Learn more here.

If you enjoyed this post, I'd love it if you could give me a follow on Twitter! :)
