Installing the WordPress plugin is easy and only takes a few seconds.
- Log in to your website's WordPress dashboard
- Click Plugins in the sidebar
- Search for "Dark Visitors" or download the plugin directly
- Click Install Now
- Click Activate
- Click Dark Visitors in the sidebar
- Paste your access token
- Select the agent types you want to block
- Click Save Changes
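If you manage your site from the command line, WP-CLI can handle the install and activation steps in one go. This is a sketch, and the plugin's directory slug here is an assumption (check the plugin's page in the WordPress directory for the exact slug); you'll still paste your access token and choose agent types in the dashboard afterwards:

# Install and activate in one step (slug assumed to be "dark-visitors")
wp plugin install dark-visitors --activate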
Enforce Your Robots.txt Rules
When you upgrade your plan, the WordPress plugin can also actively block agents that try to ignore your robots.txt rules.
If your site runs on Node.js instead, the package and docs are available on NPM.
Call the Robots.txt API to generate a new robots.txt. Do this periodically (e.g. once per day), then cache and serve the result.
Endpoint

- URL: https://api.darkvisitors.com/robots-txts
- HTTP Method: POST

Headers

- Authorization: A bearer token with your project's access token (e.g. Bearer 48d7dcbd-fc44-4b30-916b-2a5955c8ee42).
- Content-Type: Must be set to application/json.

Body

- agent_types: An array of agent types. Valid types are AI Assistant, AI Data Scraper, AI Search Crawler, and Undocumented AI Agent.
- disallow: A string specifying which URLs are disallowed. Defaults to / to disallow all URLs.
The response body is a robots.txt in text/plain format. You can use this as is, or append additional lines to include things like sitemap directives. Cache and serve this as your website's robots.txt.
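The exact rules depend on Dark Visitors' continuously updated agent list, but the generated file is made up of ordinary robots.txt groups along these lines (the agent names here are purely illustrative):

User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /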
This cURL example generates a robots.txt that blocks all known AI data scrapers and undocumented AI agents from all URLs.
curl -X POST https://api.darkvisitors.com/robots-txts \
-H "Authorization: Bearer ${ACCESS_TOKEN}" \
-H "Content-Type: application/json" \
-d '{
"agent_types": [
"AI Data Scraper",
"Undocumented AI Agent"
],
"disallow": "/"
}'
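To keep the file fresh, you might wrap that same call in a daily refresh script. Below is a minimal sketch; the webroot path and sitemap URL are assumptions to adapt for your site, and ACCESS_TOKEN is assumed to be available in the script's environment:

#!/bin/sh
# A sketch of a daily refresh job (e.g. saved as /etc/cron.daily/robots-txt).
# Regenerates robots.txt via the Dark Visitors API, appends a sitemap
# directive, and publishes it into the webroot as a static file.
WEBROOT="/var/www/html"

curl -sf -X POST https://api.darkvisitors.com/robots-txts \
    -H "Authorization: Bearer ${ACCESS_TOKEN}" \
    -H "Content-Type: application/json" \
    -d '{"agent_types": ["AI Data Scraper", "Undocumented AI Agent"], "disallow": "/"}' \
    -o "${WEBROOT}/robots.txt.tmp" \
&& printf '\nSitemap: https://example.com/sitemap.xml\n' >> "${WEBROOT}/robots.txt.tmp" \
&& mv "${WEBROOT}/robots.txt.tmp" "${WEBROOT}/robots.txt"

Writing to a temporary file and renaming it only on success means a failed request leaves the robots.txt you're currently serving untouched, and calling the API on a schedule rather than on every request keeps serving fast.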