Generate Robots.txt Category Rules With the REST API

Overview

Use the REST API to generate robots.txt rules for the agent categories you choose in just a few seconds. Please contact us if you need help getting set up.

Step 1: Generate Your Robots.txt

Make an HTTP request to generate a new robots.txt. The response is a robots.txt string in text/plain format.

URL https://api.darkvisitors.com/robots-txts
HTTP Method POST
Headers
Authorization A bearer token with your project's access token (e.g. Bearer YOUR_ACCESS_TOKEN). You can get your project's access token by navigating to the Dark Visitors Projects page, opening your project, and opening its settings page.
Content-Type This needs to be set to application/json
Body
agent_types An array of agent types you want to block or set a rule for. Allowed agent types include:
  • AI Agent
  • AI Assistant
  • AI Data Scraper
  • AI Search Crawler
  • Archiver
  • Developer Helper
  • Fetcher
  • Automated Agent
  • Intelligence Gatherer
  • Scraper
  • SEO Crawler
  • Search Engine Crawler
  • Security Scanner
  • Undocumented AI Agent
  • Uncategorized
disallow A string specifying which URLs are disallowed. Defaults to / to disallow all URLs.

Example

curl -X POST https://api.darkvisitors.com/robots-txts \
-H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
-H "Content-Type: application/json" \
-d '{
        "agent_types": [
            "AI Data Scraper",
            "Scraper",
            "Intelligence Gatherer",
            "SEO Crawler",
        ],
        "disallow": "/"
    }'
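
For reference, here is the same request made from application code instead of curl. This is a minimal sketch using Python's requests library; the access token is a placeholder, and the agent types shown are just one possible selection.

import requests

# Placeholder; use your project's access token from the Dark Visitors Projects page.
ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"

response = requests.post(
    "https://api.darkvisitors.com/robots-txts",
    headers={
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "Content-Type": "application/json",
    },
    json={
        "agent_types": [
            "AI Data Scraper",
            "Scraper",
            "Intelligence Gatherer",
            "SEO Crawler",
        ],
        "disallow": "/",
    },
    timeout=10,
)
response.raise_for_status()

# The response body is the generated robots.txt as text/plain.
print(response.text)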

Step 2: Serve Your Robots.txt

Generate a robots.txt periodically (e.g. once per day), then cache and serve it from your website's /robots.txt endpoint.
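
As one way to implement this, the sketch below regenerates the robots.txt at most once per day and serves the cached copy from /robots.txt. It assumes a Flask application and the requests library, with a placeholder access token; the 24-hour refresh interval and the fallback behavior are choices to adapt to your own framework and needs.

import time

import requests
from flask import Flask, Response

app = Flask(__name__)

REFRESH_INTERVAL_SECONDS = 24 * 60 * 60  # regenerate roughly once per day
_cache = {"body": "", "fetched_at": 0.0}


def generate_robots_txt() -> str:
    """Request a fresh robots.txt from the Dark Visitors API."""
    response = requests.post(
        "https://api.darkvisitors.com/robots-txts",
        headers={
            "Authorization": "Bearer YOUR_ACCESS_TOKEN",  # placeholder token
            "Content-Type": "application/json",
        },
        json={"agent_types": ["AI Data Scraper", "Scraper"], "disallow": "/"},
        timeout=10,
    )
    response.raise_for_status()
    return response.text


@app.route("/robots.txt")
def robots_txt() -> Response:
    # Refresh the cached robots.txt once it is older than the interval.
    if time.time() - _cache["fetched_at"] > REFRESH_INTERVAL_SECONDS:
        try:
            _cache["body"] = generate_robots_txt()
            _cache["fetched_at"] = time.time()
        except requests.RequestException:
            pass  # keep serving the previously cached copy if the API call fails
    return Response(_cache["body"], mimetype="text/plain")

Caching the generated file means your visitors never wait on the Dark Visitors API, and a failed refresh simply leaves the previous robots.txt in place.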