
Serve an Automatic Robots.txt From Your Cloudflare Website (Using a Worker)

Overview

This integration uses a simple Cloudflare Worker to append Dark Visitors automatic robots.txt rules to your existing robots.txt file. On each request for your robots.txt, the worker fetches the managed rules from the Dark Visitors API and serves them together with your own. Please contact us if you need help.
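
The Dark Visitors API responds with plain-text robots.txt rules for the agent types you choose to block, which the worker appends below your own rules. A response might look something like this (the user agent name here is purely illustrative):

User-agent: ExampleBot
Disallow: /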

Step 1: Create a Worker

Create a new Worker with the following code, replacing YOUR_ACCESS_TOKEN with your project's access token:

export default {
    async fetch(request, env, ctx) {
        // Fetch your origin's existing robots.txt and the Dark Visitors
        // rules in parallel
        const [thisResponse, thatResponse] = await Promise.all([
            fetch(request),
            fetchRobotsTXT()
        ])

        // Read both bodies, treating any failed response as empty
        const [thisRobotsTXT, thatRobotsTXT] = await Promise.all([
            thisResponse.ok ? thisResponse.text() : "",
            thatResponse.ok ? thatResponse.text() : ""
        ])

        // Append the Dark Visitors rules below your existing rules,
        // wrapped in markers that identify the managed section
        const robotsTXT = [
            thisRobotsTXT.trim(),
            "# BEGIN Dark Visitors Managed Content",
            thatRobotsTXT.trim(),
            "# END Dark Visitors Managed Content",
        ].join("\n\n")

        // Serve the combined rules as plain text
        return new Response(robotsTXT, {
            headers: {
                "Content-Type": "text/plain"
            },
        })
    },
}

async function fetchRobotsTXT() {
    return fetch("https://api.darkvisitors.com/robots-txts", {
        method: "POST",
        headers: {
            "Authorization": "Bearer YOUR_ACCESS_TOKEN",
            "Content-Type": "application/json",
        },
        body: JSON.stringify({
            agent_types: [
                // TODO: Add blocked agent types
            ],
            // The rule applied to each blocked agent type
            disallow: "/",
        }),
    })
}
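
Hardcoding the token works for testing, but you may prefer to keep it out of your source. Here is a minimal sketch of the same function reading the token from a Worker secret instead. The binding name DARK_VISITORS_ACCESS_TOKEN is our own choice (create it with wrangler secret put DARK_VISITORS_ACCESS_TOKEN); everything else matches the code above.

async function fetchRobotsTXT(env) {
    return fetch("https://api.darkvisitors.com/robots-txts", {
        method: "POST",
        headers: {
            // The secret is available on the env binding that Cloudflare
            // passes to the fetch handler
            "Authorization": `Bearer ${env.DARK_VISITORS_ACCESS_TOKEN}`,
            "Content-Type": "application/json",
        },
        body: JSON.stringify({
            agent_types: [
                // TODO: Add blocked agent types
            ],
            disallow: "/",
        }),
    })
}

If you use this variant, pass the binding through from the handler by calling fetchRobotsTXT(env).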

Step 2: Test Your Integration

Deploy the worker and add a route so that it handles requests to your robots.txt path (for example, example.com/robots.txt). If your website is correctly connected, you should see the new rules appended to your website's robots.txt.
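
For example, if your existing robots.txt allows everything and you have blocked one (hypothetical) agent, the served file might look like this:

User-agent: *
Allow: /

# BEGIN Dark Visitors Managed Content
User-agent: ExampleBot
Disallow: /
# END Dark Visitors Managed Content

Because the worker fetches the managed content on every request, the rules stay current without any further changes on your end.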