We need to add a robots.txt route to control web crawlers and prevent unnecessary indexing. A `User-agent: *` with `Disallow: /` asks all compliant crawlers to skip every path, which also slightly reduces the server load caused by aggressive crawlers.
For FastAPI, in `main.py`:

```python
from fastapi import FastAPI
from fastapi.responses import PlainTextResponse

app = FastAPI()

@app.get("/robots.txt", response_class=PlainTextResponse)
def robots_txt():
    # Tell all crawlers not to index any path
    return "User-agent: *\nDisallow: /"
```
For Django, in `urls.py`:

```python
from django.http import HttpResponse
from django.urls import path

def robots_txt(request):
    # Tell all crawlers not to index any path
    return HttpResponse("User-agent: *\nDisallow: /", content_type="text/plain")

urlpatterns = [
    path("robots.txt", robots_txt),
]
```
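
And a rough check on the Django side (a sketch using Django's built-in test client; it assumes the pattern above is wired into the root URLconf):

```python
from django.test import SimpleTestCase

class RobotsTxtTests(SimpleTestCase):
    def test_robots_txt(self):
        # The view should serve the disallow-all policy as plain text
        response = self.client.get("/robots.txt")
        self.assertEqual(response.status_code, 200)
        self.assertEqual(response["Content-Type"], "text/plain")
        self.assertEqual(response.content.decode(), "User-agent: *\nDisallow: /")
```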