Serve an Automatic Robots.txt From Your Python Backend
Overview
Use the REST API to generate and serve an automatically updating robots.txt from your website's Python backend. Please contact us if you need help.
Step 1: Generate Your Robots.txt
Define a function that makes an HTTP request to the REST API with your project's access token. Select which AgentTypes you want to block, along with a string specifying which URLs are disallowed (e.g. "/" to disallow all paths). Allowed agent types include:
AI Agent
AI Assistant
AI Data Scraper
AI Search Crawler
Archiver
Developer Helper
Fetcher
Headless Agent
Intelligence Gatherer
Scraper
SEO Crawler
Search Engine Crawler
Security Scanner
Undocumented AI Agent
Uncategorized
Paste in this code:
import aiohttp


async def generate_dark_visitors_robots_txt() -> str:
    """Fetch a generated robots.txt from the Dark Visitors REST API."""
    async with aiohttp.ClientSession() as session:
        try:
            async with session.post(
                "https://api.darkvisitors.com/robots-txts",
                headers={
                    "Authorization": "Bearer YOUR_ACCESS_TOKEN",
                    "Content-Type": "application/json",
                },
                json={
                    "agent_types": [
                        "AI Data Scraper",
                        "Scraper",
                        "Intelligence Gatherer",
                        "SEO Crawler",
                    ],
                    "disallow": "/",
                },
            ) as response:
                response.raise_for_status()
                return await response.text()
        except aiohttp.ClientResponseError as error:
            raise RuntimeError(f"Invalid response code fetching robots.txt: {error.status}") from error
        except aiohttp.ClientError as error:
            raise RuntimeError(f"Error fetching robots.txt: {error}") from error
- Then, navigate to the Projects page and open your project
- Copy your access token from the Settings page
- Back in your code, swap in your access token where it says YOUR_ACCESS_TOKEN
Here's how to use it:
robots_txt = asyncio.run(generate_dark_visitors_robots_txt())
The return value is a plain text robots.txt string.
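Note that asyncio.run() starts a new event loop, so it's only appropriate at the top level of a script. If your backend is already running inside an async framework, await generate_dark_visitors_robots_txt() directly instead.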
Step 2: Serve Your Robots.txt
Generate a robots_txt periodically (e.g. once per day), then cache and serve it from your website's /robots.txt endpoint.
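Here's a minimal sketch of one way to do that using aiohttp's built-in web server, assuming the generate_dark_visitors_robots_txt function from Step 1 is defined in the same module. The daily refresh interval, the in-memory cache, and the background task structure are illustrative choices, not requirements of the API:

import asyncio

from aiohttp import web

# In-memory cache holding the most recently generated robots.txt
_cached_robots_txt = ""


async def _refresh_robots_txt(app: web.Application) -> None:
    # Regenerate the robots.txt once per day and keep the latest copy in memory
    global _cached_robots_txt
    while True:
        try:
            # generate_dark_visitors_robots_txt() is the function from Step 1
            _cached_robots_txt = await generate_dark_visitors_robots_txt()
        except RuntimeError:
            pass  # Keep serving the previous copy if a refresh attempt fails
        await asyncio.sleep(24 * 60 * 60)


async def _start_refresh_task(app: web.Application) -> None:
    # Launch the periodic refresh as a background task when the server starts
    app["robots_txt_task"] = asyncio.create_task(_refresh_robots_txt(app))


async def serve_robots_txt(request: web.Request) -> web.Response:
    # Serve the cached robots.txt as plain text
    return web.Response(text=_cached_robots_txt, content_type="text/plain")


app = web.Application()
app.router.add_get("/robots.txt", serve_robots_txt)
app.on_startup.append(_start_refresh_task)

if __name__ == "__main__":
    web.run_app(app)

If you use a different framework (e.g. Flask, Django, or FastAPI), the same pattern applies: regenerate on a schedule, cache the result, and return it as text/plain from your /robots.txt endpoint.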