Claude-SearchBot

What is Claude-SearchBot?

About

Claude-SearchBot builds an index of websites that can be surfaced as results by the search feature in Anthropic's Claude AI assistant. You can see how often Claude-SearchBot visits your website by setting up Dark Visitors Agent Analytics.

Expected Behavior

AI search crawlers do not adhere to a fixed visitation schedule. Visit frequency varies widely based on many factors, and crawls can even happen on demand in response to a user query.

Type

AI Search Crawler
Indexes web content for AI-powered search results

Detail

Operated By Anthropic
Last Updated 16 hours ago

Insights

Top Website Robots.txts

3% of top websites are blocking Claude-SearchBot

Country of Origin

United States
Claude-SearchBot normally visits from the United States

Global Traffic

The percentage of all internet traffic coming from AI Search Crawlers

How Do I Get These Insights for My Website?
Use the WordPress plugin, Node.js package, or API to get started in seconds.
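The plugin, package, and API do the reporting for you. As an illustration of the underlying idea only (plain Node.js/TypeScript, not the Dark Visitors package), a server can recognize Claude-SearchBot by inspecting each request's User-Agent header and counting the visits it sees:

// A minimal sketch of user agent detection, not the official Dark Visitors integration
import { createServer } from "node:http";

let claudeSearchBotVisits = 0;

const server = createServer((req, res) => {
  // Claude-SearchBot identifies itself via its User-Agent header
  const userAgent = req.headers["user-agent"] ?? "";

  if (userAgent.includes("Claude-SearchBot")) {
    claudeSearchBotVisits += 1;
    console.log(`Claude-SearchBot visit #${claudeSearchBotVisits}: ${req.url}`);
  }

  res.end("ok");
});

server.listen(8080);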

Robots.txt

Should I Block Claude-SearchBot?

Probably not. AI search crawlers power AI search engines, which are a useful way for users to discover your website. AI search assistants usually cite their sources and include links to them in their results.

How Do I Block Claude-SearchBot?

You can block Claude-SearchBot or limit its access by setting user agent token rules in your website's robots.txt. Set up Dark Visitors Agent Analytics to check whether it's actually following them.
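As a rough manual complement to Agent Analytics, you can also scan your server's access log for Claude-SearchBot requests to paths disallowed by the rule shown below. A minimal sketch, assuming a combined-format access log at a hypothetical path:

// check-claude-searchbot.ts (hypothetical filename and log path)
import { readFileSync } from "node:fs";

// Mirror the Disallow rules from your robots.txt here
const DISALLOWED_PREFIXES = ["/"];

// Adjust to wherever your server writes its access log
const logLines = readFileSync("/var/log/nginx/access.log", "utf8").split("\n");

for (const line of logLines) {
  if (!line.includes("Claude-SearchBot")) continue;

  // In combined log format the request line looks like "GET /some/path HTTP/1.1"
  const path = line.match(/"[A-Z]+ (\S+) HTTP/)?.[1] ?? "";

  if (DISALLOWED_PREFIXES.some((prefix) => path.startsWith(prefix))) {
    console.log(`Claude-SearchBot fetched a disallowed path: ${path}`);
  }
}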

User Agent String

Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; compatible; Claude-SearchBot/1.0; +claude-searchbot@anthropic.com)

# In your robots.txt ...

User-agent: Claude-SearchBot # https://darkvisitors.com/agents/claude-searchbot
Disallow: /

How Do I Block All AI Search Crawlers?

Serve a continuously updating robots.txt that blocks new AI search crawlers automatically.
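As an illustration of that approach only, here is a minimal Node.js/TypeScript sketch: periodically fetch a maintained robots.txt from an upstream source, cache it, and serve the cached copy at /robots.txt. The upstream URL is a placeholder, not a real endpoint; a real setup would point at a maintained source such as Dark Visitors Automatic Robots.txt (described below).

// serve-robots.ts - sketch of serving a continuously updating robots.txt
import { createServer } from "node:http";

const UPSTREAM_URL = "https://example.com/maintained-robots.txt"; // placeholder, not a real endpoint
const REFRESH_INTERVAL_MS = 24 * 60 * 60 * 1000; // refresh once a day

// Safe fallback served until the first successful fetch
let cachedRobotsTxt = "User-agent: *\nDisallow:";

async function refreshRobotsTxt(): Promise<void> {
  try {
    const response = await fetch(UPSTREAM_URL);
    if (response.ok) cachedRobotsTxt = await response.text();
  } catch {
    // Keep serving the last known good copy if the upstream is unreachable
  }
}

refreshRobotsTxt();
setInterval(refreshRobotsTxt, REFRESH_INTERVAL_MS);

createServer((req, res) => {
  if (req.url === "/robots.txt") {
    res.writeHead(200, { "Content-Type": "text/plain" });
    res.end(cachedRobotsTxt);
  } else {
    res.writeHead(404);
    res.end();
  }
}).listen(8080);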

⚠️ Manual Robots.txt Editing Is Not Scalable

New agents are created every day. We recommend setting up Dark Visitors Automatic Robots.txt if you want to block all agents of this type.