cohere-training-data-crawler
What is cohere-training-data-crawler?
About
cohere-training-data-crawler is a web crawler operated by Cohere to download training data for its LLMs (Large Language Models) that power its enterprise AI products. Set up Dark Visitors agent analytics to see how often cohere-training-data-crawler visits your website.
Detail
Field | Value |
---|---|
Operator | Cohere |
Type | AI Data Scraper |
Expected Behavior
It's generally unclear how AI data scrapers choose which websites to crawl and how often to crawl them. They might visit websites with a higher information density more frequently, depending on the type of AI models they're training. For example, it would make sense for an agent gathering training data for an LLM (Large Language Model) to favor sites with a lot of regularly updated text content.
Analytics
Visits to Your Website
Half of your traffic probably comes from artificial agents, and there are more of them every day. Track their activity with agent analytics.
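If you want a rough, do-it-yourself view before setting up full agent analytics, you can count the crawler's visits in your own server access logs. The sketch below is a minimal illustration, not Dark Visitors' implementation; the log line format is an assumption (common nginx/Apache combined format) and the `count_agent_visits` helper is hypothetical.

```python
# Hypothetical sketch: count cohere-training-data-crawler visits by
# matching its user agent token in web server access log lines.
# Adjust the log format and path for your own server setup.
import re

UA_TOKEN = "cohere-training-data-crawler"

def count_agent_visits(log_lines, token=UA_TOKEN):
    """Return how many log lines contain `token` in the user agent (case-insensitive)."""
    pattern = re.compile(re.escape(token), re.IGNORECASE)
    return sum(1 for line in log_lines if pattern.search(line))

# Illustrative log lines (combined log format with user agent at the end)
lines = [
    '1.2.3.4 - - [01/Jan/2025:00:00:00 +0000] "GET / HTTP/1.1" 200 512 "-" "cohere-training-data-crawler/1.0"',
    '5.6.7.8 - - [01/Jan/2025:00:00:01 +0000] "GET / HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
]
print(count_agent_visits(lines))  # 1
```

Matching on the user agent token only gives an upper bound on genuine visits, since any client can spoof a user agent string.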
Set Up Agent Analytics
Robots.txt
Should I Block cohere-training-data-crawler?
It's up to you. AI data scrapers usually download publicly available internet content, which is freely accessible by default. However, you might want to block them if you're concerned about attribution or how your creative work could be used in the resulting AI model.
How Do I Block cohere-training-data-crawler?
You can block cohere-training-data-crawler or limit its access by setting user agent token rules in your website's robots.txt. Set up Dark Visitors agent analytics to check whether it's actually following them.
User Agent Token | Description |
---|---|
cohere-training-data-crawler | Should match instances of cohere-training-data-crawler |
# robots.txt
# This should block cohere-training-data-crawler
User-agent: cohere-training-data-crawler
Disallow: /
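You can sanity-check a rule like the one above with Python's built-in `urllib.robotparser` before deploying it. This is a minimal sketch; the sample paths and the second user agent name are illustrative assumptions.

```python
# Sketch: verify that the robots.txt rule blocks cohere-training-data-crawler
# everywhere while leaving other user agents unaffected.
from urllib.robotparser import RobotFileParser

rules = """User-agent: cohere-training-data-crawler
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# The named crawler is denied on every path.
print(parser.can_fetch("cohere-training-data-crawler", "/any/page"))  # False
# An agent with no matching rule (and no wildcard entry) is allowed.
print(parser.can_fetch("SomeOtherBot", "/any/page"))  # True
```

Note that robots.txt is advisory: it only works if the crawler chooses to honor it, which is why checking compliance in your analytics is worthwhile.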
Recommended Solution
Instead of maintaining these rules manually, use automatic robots.txt to keep them updated with the latest AI scrapers, crawlers, and assistants.
Set Up Automatic Robots.txt