
Add Robots.txt to your Django Project

Ordinary Coders
Originally published at ordinarycoders.com

1) Create a robots.txt file in the templates > app folder

User-Agent: [name of search engine crawler]
Disallow: [disallowed URL]
Disallow: [disallowed URL]
Sitemap: https://domain.com/sitemap.xml
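
For example, assuming the app is named main (the same app name used in the urls.py in step 6) and the conventional app-level templates directory, the file would sit next to the app's other templates:

main/
└── templates/
    └── main/
        └── robots.txt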

2) Specify the User-Agent (e.g. Googlebot, Bingbot, Slurp)
Use an * to apply the rules to all user agents.

User-Agent: *
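
The rules in a block apply to whichever crawler the User-Agent line names, so you can also write separate blocks per bot. For example (the path here is just a placeholder), the following restricts only Googlebot while leaving the site open to all other crawlers; an empty Disallow value means nothing is blocked:

User-Agent: Googlebot
Disallow: /drafts/

User-Agent: *
Disallow: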

3) Disallow URLs or directories

User-Agent: *
Disallow: /page1
Disallow: /directory/
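
Disallow rules match by prefix, so Disallow: /directory/ also covers every page under that directory. To block an entire site (a common setup for a staging environment), disallow the root path:

User-Agent: *
Disallow: /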

4) Allow URLs within a disallowed directory

User-Agent: *
Disallow: /directory/
Allow: /directory/page

5) Add the sitemap

User-Agent: *
Disallow: /directory/
Disallow: /page1
Disallow: /page2
Sitemap: https://domain.com/sitemap.xml

6) Add the robots.txt route to urls.py

from django.urls import path
from django.contrib.sitemaps.views import sitemap
from django.views.generic.base import TemplateView  # import TemplateView

from . import views
from .sitemaps import ArticleSitemap

app_name = "main"

sitemaps = {
    'blog': ArticleSitemap,
}

urlpatterns = [
    path("", views.homepage, name="homepage"),
    path('sitemap.xml', sitemap, {'sitemaps': sitemaps}, name='django.contrib.sitemaps.views.sitemap'),
    path("robots.txt", TemplateView.as_view(template_name="main/robots.txt", content_type="text/plain")),  # add the robots.txt file
]
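
The urls.py above imports ArticleSitemap from a sitemaps.py module in the same app, which this post does not show. A minimal sketch of what that file could look like, assuming a hypothetical Article model that defines get_absolute_url() and a published date field:

from django.contrib.sitemaps import Sitemap

from .models import Article  # hypothetical model used only for this sketch


class ArticleSitemap(Sitemap):
    changefreq = "weekly"  # hint for how often crawlers should revisit
    priority = 0.5

    def items(self):
        # each object returned here becomes an entry in sitemap.xml;
        # its URL comes from the model's get_absolute_url()
        return Article.objects.all()

    def lastmod(self, obj):
        # assumes the model has a publish/updated date field
        return obj.published_date

To confirm the wiring, open /robots.txt and /sitemap.xml in the browser on your development server, or write a quick check with Django's test client (assuming the app's URLs are included at the project root):

from django.test import TestCase


class RobotsTxtTest(TestCase):
    def test_robots_txt_is_served_as_plain_text(self):
        response = self.client.get("/robots.txt")
        self.assertEqual(response.status_code, 200)
        self.assertIn("text/plain", response["Content-Type"])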

Detailed Tutorial
