
Serve an Automatic Robots.txt From Your Backend

Overview

Use the REST API to generate and serve an automatically updating robots.txt from your website's backend, in any programming language. Please contact us if you need help.

Step 1: Generate Your Robots.txt

Make an HTTP request to generate a new robots.txt. The response is a robots.txt string in text/plain format.

URL https://api.darkvisitors.com/robots-txts
HTTP Method POST
Headers
Authorization A bearer token with your project's access token (e.g. Bearer YOUR_ACCESS_TOKEN). You can get your project's access token by navigating to the Projects page, opening your project, and opening its settings page.
Content-Type This needs to be set to application/json.
Body
agent_types An array of agent types you want to block or set a rule for. Allowed agent types include:
  • AI Agent
  • AI Assistant
  • AI Data Scraper
  • AI Search Crawler
  • Archiver
  • Developer Helper
  • Fetcher
  • Headless Agent
  • Intelligence Gatherer
  • Scraper
  • SEO Crawler
  • Search Engine Crawler
  • Security Scanner
  • Undocumented AI Agent
  • Uncategorized
disallow A string specifying which URLs are disallowed. Defaults to / to disallow all URLs.

Example

curl -X POST https://api.darkvisitors.com/robots-txts \
-H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
-H "Content-Type: application/json" \
-d '{
        "agent_types": [
            "AI Data Scraper",
            "Scraper",
            "Intelligence Gatherer",
            "SEO Crawler",
        ],
        "disallow": "/"
    }'
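
The same request can be made from any backend language. As a minimal sketch, here is the equivalent call in Python using the requests library; the access token is a placeholder, the agent types mirror the curl example above, and the generate_robots_txt function name is just an illustrative choice.

import requests

DARK_VISITORS_API = "https://api.darkvisitors.com/robots-txts"
ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"  # your project's access token

def generate_robots_txt() -> str:
    # Request a freshly generated robots.txt; the response body is text/plain
    response = requests.post(
        DARK_VISITORS_API,
        headers={
            "Authorization": f"Bearer {ACCESS_TOKEN}",
            "Content-Type": "application/json",
        },
        json={
            "agent_types": [
                "AI Data Scraper",
                "Scraper",
                "Intelligence Gatherer",
                "SEO Crawler",
            ],
            "disallow": "/",
        },
    )
    response.raise_for_status()
    return response.text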

Step 2: Serve Your Robots.txt

Generate a robots.txt periodically (e.g. once per day), then cache and serve it from your website's /robots.txt endpoint.
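
As a minimal sketch of this step, the Flask handler below keeps the generated robots.txt in an in-memory cache and regenerates it at most once per day, reusing the generate_robots_txt helper from the Step 1 sketch. Flask, the one-day interval, and the in-memory cache are illustrative assumptions; a scheduled job writing to a shared cache or a static file would work just as well.

import time
from flask import Flask, Response

app = Flask(__name__)

CACHE_TTL_SECONDS = 24 * 60 * 60  # regenerate at most once per day
_cached_robots_txt = None
_cached_at = 0.0

@app.route("/robots.txt")
def robots_txt():
    global _cached_robots_txt, _cached_at
    # Regenerate when there is no cached copy yet or the cached copy is stale
    if _cached_robots_txt is None or time.time() - _cached_at > CACHE_TTL_SECONDS:
        _cached_robots_txt = generate_robots_txt()
        _cached_at = time.time()
    # Serve the cached robots.txt as plain text
    return Response(_cached_robots_txt, mimetype="text/plain")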