Serve an Automatic Robots.txt From Your Node.js Backend
Overview
Use the NPM package to serve an automatic robots.txt from your website's Node.js backend in just a few seconds. Please contact us if you need help.
Step 1: Install the NPM Package
Install the package from NPM using the command line.
npm install @darkvisitors/sdk
Step 2: Initialize the Client
In your code, create an instance of DarkVisitors with your project's access token.
import { DarkVisitors } from "@darkvisitors/sdk"
const darkVisitors = new DarkVisitors("YOUR_ACCESS_TOKEN")
- Navigate to the Projects page and open your project
- Copy your access token from the Settings page
- Back in your code, replace YOUR_ACCESS_TOKEN with your access token
Step 3: Generate Your Robots.txt
Use the generateRobotsTxt function. Pass in the AgentTypes you want to block, along with a string specifying which URLs are disallowed (e.g. "/" to disallow all paths). Allowed agent types include:
AI Agent
AI Assistant
AI Data Scraper
AI Search Crawler
Archiver
Developer Helper
Fetcher
Headless Agent
Intelligence Gatherer
Scraper
SEO Crawler
Search Engine Crawler
Security Scanner
Undocumented AI Agent
Uncategorized
Here's an example:
import { AgentType } from "@darkvisitors/sdk"

const robotsTXT = await darkVisitors.generateRobotsTxt([
    AgentType.AIDataScraper,
    AgentType.Scraper,
    AgentType.IntelligenceGatherer,
    AgentType.SEOCrawler
], "/")
The return value is a plain text robots.txt string.
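For example, the generated string contains one User-agent block per blocked agent, along these lines (the agent names below are illustrative; the real list is maintained by Dark Visitors and changes over time):

User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /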
Step 4: Serve Your Robots.txt
Generate a robotsTXT periodically (e.g. once per day), then cache and serve it from your website's /robots.txt endpoint.
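As a sketch, here's one way to wire this up with Express (Express isn't required by the SDK; any framework works). It assumes AgentType is exported from @darkvisitors/sdk alongside DarkVisitors, and refreshes the cached robots.txt once per day:

import express from "express"
import { DarkVisitors, AgentType } from "@darkvisitors/sdk"

const darkVisitors = new DarkVisitors("YOUR_ACCESS_TOKEN")
const app = express()

// Cache the most recently generated robots.txt in memory
let cachedRobotsTXT = ""

async function refreshRobotsTXT() {
    try {
        cachedRobotsTXT = await darkVisitors.generateRobotsTxt([
            AgentType.AIDataScraper,
            AgentType.Scraper,
            AgentType.IntelligenceGatherer,
            AgentType.SEOCrawler
        ], "/")
    } catch (error) {
        // Keep serving the previous cached version if the refresh fails
        console.error("Failed to refresh robots.txt", error)
    }
}

// Generate once at startup, then refresh once per day
refreshRobotsTXT()
setInterval(refreshRobotsTXT, 24 * 60 * 60 * 1000)

// Serve the cached robots.txt from the /robots.txt endpoint
app.get("/robots.txt", (req, res) => {
    res.type("text/plain").send(cachedRobotsTXT)
})

app.listen(3000)

If your site runs behind a CDN, you can also set a Cache-Control header on the response so the file is cached at the edge between refreshes.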