Set Up Robots.txt Categories for Your Node.js Backend
Overview
Use the NPM package to connect your Node.js website to Robots.txt Categories in just a few seconds. Please contact us if you need help getting set up.
Step 1: Install the NPM Package
Install the package from npm using the command line.
npm install @darkvisitors/sdk
Step 2: Initialize the Client
In your code, create an instance of DarkVisitors with your project's access token.
import { DarkVisitors } from "@darkvisitors/sdk"
const darkVisitors = new DarkVisitors("YOUR_ACCESS_TOKEN")
- Navigate to the Dark Visitors Projects page and open your project
- Copy your access token from the Settings page
- Back in your code, swap in your access token where it says YOUR_ACCESS_TOKEN
Step 3: Generate Your Robots.txt
Call the generateRobotsTxt function, passing an array of the AgentTypes you want to block and a string specifying which paths are disallowed (e.g. "/" to disallow all paths). The available agent types are:
AI Agent, AI Assistant, AI Data Scraper, AI Search Crawler, Archiver, Developer Helper, Fetcher, Automated Agent, Intelligence Gatherer, Scraper, SEO Crawler, Search Engine Crawler, Security Scanner, Undocumented AI Agent, Uncategorized
Here's an example:
import { AgentType } from "@darkvisitors/sdk"

const robotsTXT = await darkVisitors.generateRobotsTxt([
    AgentType.AIDataScraper,
    AgentType.Scraper,
    AgentType.IntelligenceGatherer,
    AgentType.SEOCrawler
], "/")
The return value is a plain text robots.txt string.
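For reference, the returned string follows the standard robots.txt format, with a User-agent line and a Disallow rule for each agent that matches the selected types. The agent names below are placeholders; the real names depend on Dark Visitors' current agent list:
User-agent: SomeAIDataScraperBot
Disallow: /

User-agent: SomeScraperBot
Disallow: /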
Step 4: Serve Your Robots.txt
Generate the robots.txt periodically (e.g. once per day), then cache it and serve the cached copy from your website's /robots.txt endpoint, as shown in the sketch below.
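Here's a minimal sketch of how this might look with Express. The framework choice, port, blocked agent types, and refresh interval are illustrative, not prescribed by the SDK:
import express from "express"
import { DarkVisitors, AgentType } from "@darkvisitors/sdk"

const darkVisitors = new DarkVisitors("YOUR_ACCESS_TOKEN")
const app = express()

// Hold the most recently generated robots.txt in memory
let cachedRobotsTxt = ""

async function refreshRobotsTxt() {
    try {
        cachedRobotsTxt = await darkVisitors.generateRobotsTxt([
            AgentType.AIDataScraper,
            AgentType.Scraper
        ], "/")
    } catch (error) {
        // Keep serving the previous cached copy if regeneration fails
        console.error("Failed to refresh robots.txt", error)
    }
}

// Generate once at startup, then refresh once per day
refreshRobotsTxt()
setInterval(refreshRobotsTxt, 24 * 60 * 60 * 1000)

// Serve the cached robots.txt as plain text
app.get("/robots.txt", (req, res) => {
    res.type("text/plain").send(cachedRobotsTxt)
})

app.listen(3000)
This keeps generation off the request path, so visitors always receive a fast, cached response even if a refresh fails or is in progress.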