1) Create a robots.txt file in the templates > app folder (here templates/main/robots.txt, matching the template_name used in step 6)
User-Agent: [name of search engine crawler]
Disallow: [disallowed URL]
Disallow: [disallowed URL]
Sitemap: https://domain.com/sitemap.xml
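Assuming the app is named main (as in the urls.py shown in step 6), the template would sit at:
main/
    templates/
        main/
            robots.txt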
2) Specify the User-Agent (e.g. Googlebot, Bingbot, Slurp)
Use an asterisk (*) to apply the rules to all user agents, as shown below; a crawler-specific group is sketched after this step.
User-Agent: *
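To target a single crawler instead, name it in its own group. The snippet below is only an illustration and the paths are placeholders:
User-Agent: Googlebot
Disallow: /drafts/

User-Agent: *
Disallow: /private/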
3) Disallow URLs or directories
User-Agent: *
Disallow: /page1
Disallow: /directory/
4) Allow specific URLs inside a disallowed directory
User-Agent: *
Disallow: /directory/
Allow: /directory/page
5) Add the sitemap
User-Agent: *
Disallow: /directory/
Disallow: /page1
Disallow: /page2
Sitemap: https://domain.com/sitemap.xml
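Before deploying, the rules can be sanity-checked locally with Python's built-in urllib.robotparser. A minimal sketch, using the same paths as the example above:
from urllib.robotparser import RobotFileParser

rules = """\
User-Agent: *
Disallow: /directory/
Disallow: /page1
Disallow: /page2
"""

# Parse the rules directly instead of fetching them over HTTP
rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("*", "/page1"))        # False - explicitly disallowed
print(rp.can_fetch("*", "/directory/x"))  # False - inside a disallowed directory
print(rp.can_fetch("*", "/about"))        # True - not matched by any rule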
6) Add the robots.txt route to urls.py
from django.urls import path
from django.contrib.sitemaps.views import sitemap
from django.views.generic.base import TemplateView  # import TemplateView to serve robots.txt

from . import views
from .sitemaps import ArticleSitemap

app_name = "main"

sitemaps = {
    'blog': ArticleSitemap,
}

urlpatterns = [
    path("", views.homepage, name="homepage"),
    path('sitemap.xml', sitemap, {'sitemaps': sitemaps}, name='django.contrib.sitemaps.views.sitemap'),
    # Serve robots.txt as a plain-text template
    path("robots.txt", TemplateView.as_view(template_name="main/robots.txt", content_type="text/plain")),
]
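The urls.py above imports ArticleSitemap from the app's sitemaps.py, which was set up in the earlier sitemap step. For reference, a minimal sketch of such a class, assuming a blog Article model with an updated timestamp field (both names are assumptions):
from django.contrib.sitemaps import Sitemap
from .models import Article  # assumed model name

class ArticleSitemap(Sitemap):
    changefreq = "weekly"
    priority = 0.9

    def items(self):
        # Objects to include in the sitemap
        return Article.objects.all()

    def lastmod(self, obj):
        # Assumed field holding the last-updated timestamp
        return obj.updated
With the development server running, http://127.0.0.1:8000/robots.txt should return the template with a text/plain content type, and http://127.0.0.1:8000/sitemap.xml should return the generated sitemap.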