bnf.fr_bot

What is bnf.fr_bot?

About

bnf.fr_bot is the official web crawler of the Bibliothèque nationale de France (BnF). It systematically collects and archives digital content from French websites to preserve France's national documentary heritage. You can see how often bnf.fr_bot visits your website by setting up Dark Visitors Agent Analytics.

Expected Behavior

Archivers visit websites on a roughly regular cadence, since snapshots are more useful when they're regularly spaced out. Popular websites will have more frequent visits since they are more likely to be queried in the historical database in the future.

Type

Archiver
Snapshots websites for historical databases

Detail

Operated By Bibliothèque nationale de France
Last Updated 16 hours ago

Insights

Top Website Robots.txts

0% of top websites are blocking bnf.fr_bot

Country of Origin

France
bnf.fr_bot normally visits from France

Global Traffic

The percentage of all internet traffic coming from Archivers

Top Visited Website Categories

Beauty and Fitness
Food and Drink
Health
Business and Industrial
Arts and Entertainment

How Do I Get These Insights for My Website?

Use the WordPress plugin, Node.js package, or API to get started in seconds.

Robots.txt

Should I Block bnf.fr_bot?

It's up to you. Digital archiving is generally done to preserve a historical record. If you don't want to be part of that record for some reason, you can block archivers.

How Do I Block bnf.fr_bot?

You can block bnf.fr_bot or limit its access by setting user agent token rules in your website's robots.txt. Set up Dark Visitors Agent Analytics to check whether it's actually following them.
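As a sketch of what that compliance check can look like, the snippet below scans web server access logs (combined log format assumed) for requests whose user agent contains the bnf.fr_bot token and flags any that hit a path your robots.txt disallows. The sample log line and the /private/ prefix are illustrative.

```python
import re

# Matches the request, status, and user agent fields of a combined-format log line.
LOG_LINE = re.compile(
    r'"(?:GET|POST|HEAD) (?P<path>\S+) [^"]*" \d{3} \S+ "[^"]*" "(?P<ua>[^"]*)"'
)

def violations(log_lines, disallowed_prefixes=("/",)):
    """Return paths bnf.fr_bot fetched despite a Disallow rule."""
    hits = []
    for line in log_lines:
        m = LOG_LINE.search(line)
        if not m or "bnf.fr_bot" not in m.group("ua"):
            continue  # not a bnf.fr_bot request
        path = m.group("path")
        if any(path.startswith(p) for p in disallowed_prefixes):
            hits.append(path)
    return hits

sample = [
    '1.2.3.4 - - [01/Jan/2025:00:00:00 +0000] "GET /private/page HTTP/1.1" 200 512 "-" '
    '"Mozilla/5.0 (compatible; bnf.fr_bot; +http://www.bnf.fr/fr/outils/a.dl_web_capture_robot.html) Firefox/57"',
]
print(violations(sample, disallowed_prefixes=("/private/",)))  # → ['/private/page']
```

An empty result over a log window that includes bnf.fr_bot traffic suggests the crawler is honoring your rules.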

How Do I Block All Archivers?

Serve a continuously updating robots.txt that blocks new archivers automatically.
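A minimal sketch of that idea: regenerate robots.txt from a maintained list of archiver user agent tokens, so a newly listed agent is blocked without hand-editing the file. The token list here is hard-coded for illustration; a real deployment would refresh it periodically from a feed such as the Dark Visitors API.

```python
# Illustrative token list — a real setup would fetch this from an external feed.
ARCHIVER_TOKENS = ["bnf.fr_bot", "ia_archiver", "archive.org_bot"]

def build_robots_txt(tokens):
    """Emit one User-agent/Disallow block per archiver token."""
    blocks = [f"User-agent: {token}\nDisallow: /" for token in tokens]
    return "\n\n".join(blocks) + "\n"

print(build_robots_txt(ARCHIVER_TOKENS))
```

Serving the generated text at /robots.txt keeps the block list current as long as the token list is refreshed.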
User Agent String: Mozilla/5.0 (compatible; bnf.fr_bot; +http://www.bnf.fr/fr/outils/a.dl_web_capture_robot.html) Firefox/57

# In your robots.txt ...

User-agent: bnf.fr_bot # https://darkvisitors.com/agents/bnf-fr-bot
Disallow: /
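If you would rather limit bnf.fr_bot's access than block it entirely, you can disallow only specific sections of your site. The /private/ path below is illustrative; substitute the directories you want kept out of the archive.

```
# In your robots.txt ...

User-agent: bnf.fr_bot # https://darkvisitors.com/agents/bnf-fr-bot
Disallow: /private/
```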

⚠️ Manual Robots.txt Editing Is Not Scalable

New agents are created every day. We recommend setting up Dark Visitors Automatic Robots.txt if you want to block all agents of this type.
