Set Up Robots.txt Categories for Your Python Backend
Overview
Use the REST API to connect your Python website to Robots.txt Categories in just a few seconds. Please contact us if you need help getting set up.
Step 1: Generate Your Robots.txt
Define a function that makes an HTTP request to the REST API with your project's access token. The request body specifies which agent types you want to block and a disallow string specifying which URL paths are off-limits (e.g. "/" to disallow all paths). Allowed agent types include:
AI Agent, AI Assistant, AI Data Scraper, AI Search Crawler, Archiver, Developer Helper, Fetcher, Automated Agent, Intelligence Gatherer, Scraper, SEO Crawler, Search Engine Crawler, Security Scanner, Undocumented AI Agent, Uncategorized
Paste in this code:
import aiohttp


async def generate_dark_visitors_robots_txt() -> str:
    async with aiohttp.ClientSession() as session:
        try:
            # Request a generated robots.txt that disallows the selected agent types
            async with session.post(
                "https://api.darkvisitors.com/robots-txts",
                headers={
                    "Authorization": "Bearer YOUR_ACCESS_TOKEN",
                    "Content-Type": "application/json",
                },
                json={
                    "agent_types": [
                        "AI Data Scraper",
                        "Scraper",
                        "Intelligence Gatherer",
                        "SEO Crawler",
                    ],
                    "disallow": "/",
                },
            ) as response:
                # Fail on non-2xx status codes, then return the robots.txt body as text
                response.raise_for_status()
                return await response.text()
        except aiohttp.ClientResponseError as error:
            raise RuntimeError(f"Invalid response code fetching robots.txt: {error.status}") from error
        except aiohttp.ClientError as error:
            raise RuntimeError(f"Error fetching robots.txt: {error}") from error
- Navigate to the Dark Visitors Projects page and open your project
- Copy your access token from the Settings page
- Back in your code, swap in your access token where it says YOUR_ACCESS_TOKEN (or load it from the environment, as in the sketch after this list)
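If you'd rather not hard-code the token, one option is to read it from an environment variable. This is a minimal sketch; the variable name DARK_VISITORS_ACCESS_TOKEN is an assumption, so use whatever fits your deployment:

import os

# Assumed environment variable name; set it however your deployment manages secrets
access_token = os.environ["DARK_VISITORS_ACCESS_TOKEN"]

headers = {
    "Authorization": f"Bearer {access_token}",
    "Content-Type": "application/json",
}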
Here's how to use it:
import asyncio

robots_txt = asyncio.run(generate_dark_visitors_robots_txt())
The return value is a plain text robots.txt string.
Step 2: Serve Your Robots.txt
Regenerate the robots.txt periodically (e.g. once per day), then cache and serve it from your website's /robots.txt endpoint.
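Here's a minimal sketch of one way to do this, assuming FastAPI as the web framework and a simple in-memory cache with a 24-hour TTL; adapt it to your own framework and caching layer:

import time

from fastapi import FastAPI
from fastapi.responses import PlainTextResponse

app = FastAPI()

CACHE_TTL_SECONDS = 24 * 60 * 60  # regenerate roughly once per day

_cached_robots_txt: str | None = None
_cached_at: float = 0.0


@app.get("/robots.txt", response_class=PlainTextResponse)
async def robots_txt() -> str:
    global _cached_robots_txt, _cached_at
    # Refresh the cached robots.txt when it's missing or older than the TTL
    if _cached_robots_txt is None or time.monotonic() - _cached_at > CACHE_TTL_SECONDS:
        _cached_robots_txt = await generate_dark_visitors_robots_txt()
        _cached_at = time.monotonic()
    return _cached_robots_txt

If your site runs across multiple workers or servers, store the cached string somewhere shared (e.g. Redis or your database) rather than in process memory, so every instance serves the same robots.txt.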