Factset_spyderbot

Last updated 1 day ago.

What is Factset_spyderbot?

About

Factset_spyderbot is an AI data scraper operated by Factset. Beyond that classification, little is publicly documented about its purpose. If you think this is incorrect or can provide more detail, please contact us. You can see how often Factset_spyderbot visits your website by setting up Dark Visitors agent analytics.

Detail

Operator: Factset

Type: AI Data Scraper (downloads web content to train AI models)

Expected Behavior

It's generally unclear how AI data scrapers choose which websites to crawl and how often to crawl them. They might choose to visit websites with a higher information density more frequently, depending on the type of AI models they're training. For example, it would make sense that an agent collecting training data for an LLM (Large Language Model) would favor sites with a lot of regularly updated text content.

Agent Analytics

Visits to Your Website

Half of your traffic probably comes from artificial agents, and there are more of them every day. Track their activity with agent analytics.

Top Websites

0% of top websites are blocking Factset_spyderbot in their robots.txt
Learn How →

Robots.txt

Should I Block Factset_spyderbot?

It's up to you. AI data scrapers usually download publicly available internet content, which is freely accessible by default. However, you might want to block them if you're concerned about attribution or how your creative work could be used in the resulting AI model.

How Do I Block Factset_spyderbot?

You can block Factset_spyderbot or limit its access by setting user agent token rules in your website's robots.txt. Set up Dark Visitors agent analytics to check whether it's actually following them.
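If you prefer to check compliance yourself, one option is to scan your server's access logs for the agent's user agent string. Below is a minimal sketch in Python; the log path and the combined log format (user agent as the last quoted field) are assumptions you would adjust for your own server setup.

```python
import re
from collections import Counter

def count_agent_hits(log_path: str, token: str = "Factset_spyderbot") -> Counter:
    """Count requests per path made by a given user agent token.

    Assumes the combined log format, where the user agent is the
    last quoted field on each line.
    """
    hits = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as log:
        for line in log:
            quoted = re.findall(r'"([^"]*)"', line)
            if len(quoted) >= 3 and token in quoted[-1]:
                # The request field looks like 'GET /path HTTP/1.1'.
                parts = quoted[0].split()
                path = parts[1] if len(parts) > 1 else "?"
                hits[path] += 1
    return hits

# Hypothetical log location; adjust for your server.
# print(count_agent_hits("/var/log/nginx/access.log"))
```

If the agent keeps requesting paths you've disallowed after your robots.txt change, it isn't honoring your rules, and you may need to block it at the server or firewall level instead.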

User Agent Token: Factset_spyderbot
Description: Should match instances of Factset_spyderbot
# robots.txt
# This should block Factset_spyderbot

User-agent: Factset_spyderbot
Disallow: /
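You can sanity-check rules like these before deploying them with Python's standard-library robots.txt parser. This sketch parses the rules above directly (rather than fetching them from a live site) and confirms they block the named agent while leaving others unaffected:

```python
from urllib.robotparser import RobotFileParser

# The same rules as in the robots.txt snippet above.
rules = """\
User-agent: Factset_spyderbot
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Blocked for the named agent, still open to everyone else.
print(parser.can_fetch("Factset_spyderbot", "https://example.com/page"))  # False
print(parser.can_fetch("SomeOtherBot", "https://example.com/page"))       # True
```

Because there is no wildcard `User-agent: *` group, only agents matching the `Factset_spyderbot` token are affected.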

Recommended Solution

Instead of maintaining these rules manually, use automatic robots.txt to keep them updated with the latest AI scrapers, crawlers, and assistants.

Global Bot & LLM Traffic

AI Data Scrapers

The overall volume of internet traffic coming from AI Data Scrapers

See Global Bot & LLM Traffic →