Serve an Automatic Robots.txt From Your Node.js Backend

Overview

Use the @darkvisitors/sdk NPM package to serve an automatic robots.txt from your website's Node.js backend in just a few seconds. Please contact us if you need help.

Step 1: Install the NPM Package

Install the package from NPM using the command line.

npm install @darkvisitors/sdk

Step 2: Initialize the Client

In your code, create an instance of DarkVisitors with your project's access token.

import { DarkVisitors, AgentType } from "@darkvisitors/sdk"

const darkVisitors = new DarkVisitors("YOUR_ACCESS_TOKEN")
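
For production, you may prefer to keep the token out of source control and read it from the environment instead. A minimal sketch (the DARK_VISITORS_TOKEN variable name is just an illustration, not part of the SDK):

import { DarkVisitors } from "@darkvisitors/sdk"

// DARK_VISITORS_TOKEN is a hypothetical environment variable name
const darkVisitors = new DarkVisitors(process.env.DARK_VISITORS_TOKEN!)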

Step 3: Generate Your Robots.txt

Use the generateRobotsTxt function. Pass an array of the AgentTypes you want to block, along with a string specifying which URLs are disallowed (e.g. "/" to disallow all paths).

Here's an example:

const robotsTXT = await darkVisitors.generateRobotsTxt([
    AgentType.AIDataScraper,
    AgentType.Scraper,
    AgentType.IntelligenceGatherer,
    AgentType.SEOCrawler
], "/")

The return value is a plain text robots.txt string.
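
For instance, blocking these agent types might produce output along the following lines. The specific user agents shown here are illustrative; the real list is maintained by Dark Visitors and changes over time:

User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /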

Step 4: Serve Your Robots.txt

Regenerate your robots.txt periodically (e.g. once per day), then cache and serve the result from your website's /robots.txt endpoint.
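
Here's a minimal sketch of what that could look like with Express. Express itself, the port, and the DARK_VISITORS_TOKEN environment variable are assumptions for this example, not part of the SDK:

import express from "express"
import { DarkVisitors, AgentType } from "@darkvisitors/sdk"

// Assumes the access token is stored in an environment variable
const darkVisitors = new DarkVisitors(process.env.DARK_VISITORS_TOKEN!)
const app = express()

// Hold the most recent robots.txt in memory
let cachedRobotsTxt = ""

async function refreshRobotsTxt() {
    try {
        cachedRobotsTxt = await darkVisitors.generateRobotsTxt([
            AgentType.AIDataScraper,
            AgentType.Scraper,
            AgentType.IntelligenceGatherer,
            AgentType.SEOCrawler
        ], "/")
    } catch (error) {
        // Keep serving the last good copy if regeneration fails
        console.error("Failed to refresh robots.txt:", error)
    }
}

// Generate once at startup, then refresh once per day
refreshRobotsTxt()
setInterval(refreshRobotsTxt, 24 * 60 * 60 * 1000)

app.get("/robots.txt", (req, res) => {
    res.type("text/plain").send(cachedRobotsTxt)
})

app.listen(3000)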