Serve an Automatic Robots.txt From Your Cloudflare Website (Using a Worker)
Overview
This integration uses a simple Cloudflare Worker to append Dark Visitors automatic robots.txt rules to your existing robots.txt file. Please contact us if you need help.
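For example, if your existing robots.txt contains a single sitemap directive and you block one agent type, the merged file served to crawlers might look something like this (the user agent name below is illustrative; the actual rules come from the Dark Visitors API):

Sitemap: https://example.com/sitemap.xml

# BEGIN Dark Visitors Managed Content

User-agent: ExampleAIScraper
Disallow: /

# END Dark Visitors Managed Content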
Step 1: Create a Worker
- Open your domain in your Cloudflare dashboard
- In the sidebar, click Workers & Pages under Compute (Workers)
- Click the Create application button
- Select the Start with Hello World! option
- Name the worker (something like dark-visitors-robots-txt) and click Deploy
- In the top right, click the Edit code button
- Paste this code into the worker.js file:
export default {
    async fetch(request, env, ctx) {
        // Fetch the site's existing robots.txt and the Dark Visitors
        // generated rules in parallel
        const [thisResponse, thatResponse] = await Promise.all([
            fetch(request),
            fetchRobotsTXT()
        ])

        // Read both bodies, falling back to an empty string if either failed
        const [thisRobotsTXT, thatRobotsTXT] = await Promise.all([
            thisResponse.ok ? thisResponse.text() : "",
            thatResponse.ok ? thatResponse.text() : ""
        ])

        // Append the managed rules to the existing robots.txt content,
        // wrapped in marker comments
        const robotsTXT = [
            thisRobotsTXT.trim(),
            "# BEGIN Dark Visitors Managed Content",
            thatRobotsTXT.trim(),
            "# END Dark Visitors Managed Content",
        ].join("\n\n")

        return new Response(robotsTXT, {
            headers: {
                "Content-Type": "text/plain"
            },
        })
    },
}

async function fetchRobotsTXT() {
    // Request the generated robots.txt rules from the Dark Visitors API
    return fetch("https://api.darkvisitors.com/robots-txts", {
        method: "POST",
        headers: {
            "Authorization": "Bearer YOUR_ACCESS_TOKEN",
            "Content-Type": "application/json",
        },
        body: JSON.stringify({
            agent_types: [
                // TODO: Add blocked agent types
            ],
            disallow: "/",
        }),
    })
}
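Once you have configured it in the steps below, the fetchRobotsTXT function might end up looking something like this (the agent types here are examples only; choose your own from the list below):

async function fetchRobotsTXT() {
    return fetch("https://api.darkvisitors.com/robots-txts", {
        method: "POST",
        headers: {
            "Authorization": "Bearer YOUR_ACCESS_TOKEN", // replace with your real token
            "Content-Type": "application/json",
        },
        body: JSON.stringify({
            // Example agent types only
            agent_types: [
                "AI Data Scraper",
                "Undocumented AI Agent",
            ],
            disallow: "/",
        }),
    })
}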
- Navigate back to the Dark Visitors Projects page and open your project
- Copy your access token from the Settings page
- Back in Cloudflare, swap in your access token where it says YOUR_ACCESS_TOKEN
- Where it says // TODO: Add blocked agent types, add the agent types you want to block (as in the sketch above), and a string specifying which URLs are disallowed (e.g. "/" to disallow all paths). Allowed agent types include: "AI Agent", "AI Assistant", "AI Data Scraper", "AI Search Crawler", "Archiver", "Developer Helper", "Fetcher", "Headless Agent", "Intelligence Gatherer", "Scraper", "SEO Crawler", "Search Engine Crawler", "Security Scanner", "Undocumented AI Agent", "Uncategorized"
- Click Deploy
- Navigate back to your Cloudflare dashboard and select your domain
- In the sidebar, click Workers Routes
- Click Add route in the HTTP Routes section
- Under Route, enter your domain followed by /robots.txt (e.g. example.com/robots.txt) and select the worker you just created
- Click Save
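Note that a route only matches the exact host and path you enter, so example.com/robots.txt will not catch requests to www.example.com/robots.txt. If your site is also reachable on other hostnames, you may want to add a route for each one, pointing at the same worker.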
Step 2: Test Your Integration
Visit your website's robots.txt (e.g. example.com/robots.txt). If the integration is working, you should see the new rules appended between the # BEGIN Dark Visitors Managed Content and # END Dark Visitors Managed Content markers.
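If you'd rather check programmatically, here is a minimal sketch (using example.com as a stand-in for your domain; runnable in a browser console or a modern Node.js REPL):

// Fetch the robots.txt served by the worker
const response = await fetch("https://example.com/robots.txt")
const body = await response.text()

// Should log true if the managed block is being appended
console.log(body.includes("# BEGIN Dark Visitors Managed Content"))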