ClaudeBot


What is ClaudeBot?

About

ClaudeBot is a web crawler operated by Anthropic to download training data for its LLMs (Large Language Models) that power AI products like Claude.

Track ClaudeBot Visiting Your Website
You can see when ClaudeBot visits your website using the API or WordPress plugin.
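If you prefer to check your own server logs directly, ClaudeBot identifies itself in the User-Agent header. The sketch below is a minimal, hypothetical example of counting ClaudeBot requests in combined-format access log lines; the log entries and user agent string shown are illustrative, not real traffic.

```python
# Hypothetical sketch: count ClaudeBot hits in web server access log lines.
# The sample lines below are illustrative, not real ClaudeBot traffic.
LOG_LINES = [
    '203.0.113.7 - - [01/Jan/2025:12:00:00 +0000] "GET / HTTP/1.1" 200 1234 "-" '
    '"Mozilla/5.0; compatible; ClaudeBot/1.0"',
    '198.51.100.4 - - [01/Jan/2025:12:00:05 +0000] "GET /about HTTP/1.1" 200 2345 "-" '
    '"Mozilla/5.0 (Windows NT 10.0; Win64; x64)"',
]

def count_claudebot_hits(lines):
    """Return the number of log lines whose user agent contains 'ClaudeBot'."""
    return sum(1 for line in lines if "ClaudeBot" in line)

print(count_claudebot_hits(LOG_LINES))  # → 1
```

A substring check on the agent token is usually enough for rough tracking, though dedicated analytics tools can also verify that a request genuinely originates from the claimed crawler.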

Detail

Operator Anthropic
Documentation https://support.anthropic.com/en/articles/8896518-does-anthropic-crawl-data-from-the-web-and-how-can-site-owners-block-the-crawler

Type

AI Data Scraper
Downloads web content to train AI models

Expected Behavior

It's generally unclear how AI data scrapers choose which websites to crawl and how often to crawl them. They might choose to visit websites with a higher information density more frequently, depending on the type of AI models they're training. For example, it would make sense that an organization training an LLM (Large Language Model) would favor sites with a lot of regularly updated text content.

Insights

ClaudeBot's Activity on Your Website

Half of your traffic probably comes from artificial agents, and there are more of them every day. Track their activity with the API or WordPress plugin.

Set Up Agent Analytics

Other Websites

13%
of top websites are currently blocking ClaudeBot in some way
Learn How →

Access Control

Should I Block ClaudeBot?

It's up to you. AI data scrapers usually download publicly available internet content, which is freely accessible by default. However, you might want to block them if you're concerned about attribution or how your creative work could be used in the resulting AI model.

Using Robots.txt

User Agent Token    Description
ClaudeBot           Should match instances of ClaudeBot

You can block ClaudeBot or limit its access by setting user agent token rules in your website's robots.txt.

# robots.txt
# This should block ClaudeBot

User-agent: ClaudeBot
Disallow: /
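Note that robots.txt is a voluntary convention: well-behaved crawlers honor it, but nothing in the protocol enforces it. If you want a hard block, you can also reject requests at the server or application level by matching the user agent token. This is a minimal sketch under that assumption; `is_blocked_agent` is a hypothetical helper, and how you return a 403 depends on your web framework.

```python
def is_blocked_agent(user_agent, blocked_tokens=("ClaudeBot",)):
    """Return True if the User-Agent header contains a blocked crawler token."""
    ua = user_agent or ""  # tolerate a missing header
    return any(token in ua for token in blocked_tokens)

# In a request handler you might respond with 403 Forbidden when
# is_blocked_agent(request.headers.get("User-Agent")) is True.
print(is_blocked_agent("Mozilla/5.0; compatible; ClaudeBot/1.0"))  # → True
print(is_blocked_agent("Mozilla/5.0 (Windows NT 10.0; Win64; x64)"))  # → False
```

Server-level blocking and robots.txt rules complement each other: the former enforces, the latter signals your preference to compliant crawlers.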

Instead of doing this manually, you can use the API or WordPress plugin to keep your robots.txt automatically updated with the latest known AI scrapers, crawlers, and assistants.

Set Up Automatic Robots.txt
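Whether you maintain the list by hand or pull it from an API, generating the robots.txt itself is straightforward. A minimal sketch, assuming you already have a list of agent tokens to block (the tokens beyond ClaudeBot are illustrative, not an exhaustive or authoritative list):

```python
# Illustrative token list; in practice this would come from a maintained source.
AI_SCRAPER_TOKENS = ["ClaudeBot", "GPTBot", "CCBot"]

def build_robots_txt(tokens):
    """Emit one User-agent/Disallow block per crawler token."""
    blocks = [f"User-agent: {token}\nDisallow: /" for token in tokens]
    return "\n\n".join(blocks) + "\n"

print(build_robots_txt(AI_SCRAPER_TOKENS))
```

Regenerating and redeploying this file whenever the token list changes keeps your rules current without hand-editing.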