Set Up Robots.txt Categories for Your Cloudflare Website

Overview

Use a simple Cloudflare Worker to fetch your Robots.txt Categories from the Dark Visitors API and append them to your website's existing robots.txt at request time. Please contact us if you need help getting set up.

Step 1: Create a Worker
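Create a new Worker in your Cloudflare dashboard and replace its contents with the following code: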

const DARK_VISITORS_ACCESS_TOKEN = "YOUR_ACCESS_TOKEN" // TODO: Swap in your access token
const ROBOTS_TXT_DISALLOW_PATH = "/" // The path blocked agents are disallowed from crawling
const ROBOTS_TXT_AGENT_TYPES = [
    // TODO: Add blocked agent types, e.g. "AI Data Scraper"
]

export default {
    async fetch(request, env, ctx) {
        // Fetch the website's existing robots.txt and the generated
        // Dark Visitors robots.txt in parallel
        const [thisResponse, thatResponse] = await Promise.all([
            fetch(request),
            fetchRobotsTXT()
        ])

        // Fall back to an empty string if either fetch failed
        const [thisRobotsTXT, thatRobotsTXT] = await Promise.all([
            thisResponse.ok ? thisResponse.text() : "",
            thatResponse.ok ? thatResponse.text() : ""
        ])

        // Append the generated rules to the existing ones, wrapped in
        // markers that make the managed content easy to identify
        const robotsTXT = [
            thisRobotsTXT.trim(),
            "# BEGIN Dark Visitors Managed Content",
            thatRobotsTXT.trim(),
            "# END Dark Visitors Managed Content",
        ].join("\n\n")

        // Serve the combined robots.txt as plain text
        return new Response(robotsTXT, {
            headers: {
                "Content-Type": "text/plain"
            },
        })
    },
}

// Request a generated robots.txt from the Dark Visitors API, disallowing
// the configured path for the configured agent types
async function fetchRobotsTXT() {
    return fetch("https://api.darkvisitors.com/robots-txts", {
        method: "POST",
        headers: {
            "Authorization": `Bearer ${DARK_VISITORS_ACCESS_TOKEN}`,
            "Content-Type": "application/json",
        },
        body: JSON.stringify({
            agent_types: ROBOTS_TXT_AGENT_TYPES,
            disallow: ROBOTS_TXT_DISALLOW_PATH,
        }),
    })
}
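Deploy the Worker on a route that matches only your robots.txt path, so it doesn't intercept other pages on your site. If you deploy with Wrangler, you can configure the route in wrangler.toml. This is a minimal sketch, assuming your zone is example.com; the worker name and file path are placeholders:

name = "robots-txt-worker"
main = "src/index.js"
compatibility_date = "2024-01-01"

# Run the Worker only for robots.txt requests
routes = [
    { pattern = "example.com/robots.txt", zone_name = "example.com" }
]

You can also attach the same route in the Cloudflare dashboard under your Worker's settings.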

Step 2: Test Your Integration

Visit your website's robots.txt (for example, https://yourwebsite.com/robots.txt). If the Worker is correctly connected, the new rules should appear between the Dark Visitors managed content markers.
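For example, you can check from the command line (assuming your site is example.com):

curl https://example.com/robots.txt

The response should end with your managed block, along these lines (the exact user agents depend on the agent types you blocked; GPTBot is just an illustration):

# BEGIN Dark Visitors Managed Content

User-agent: GPTBot
Disallow: /

# END Dark Visitors Managed Content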