netarkivindsamling

What is netarkivindsamling?

About

netarkivindsamling is a web crawler operated by the Royal Danish Library that collects Danish internet content according to the Danish Legal Deposit Act for preservation and research purposes. You can see how often netarkivindsamling visits your website by setting up Dark Visitors agent analytics.

Expected Behavior

Archivers visit websites on a roughly regular cadence, since snapshots are more useful when they're evenly spaced over time. Popular websites are visited more often, because they're more likely to be looked up in the historical database later.

Type

Archiver
Snapshots websites for historical databases

Detail

Operated By Netarkivet
Last Updated 17 hours ago

Insights

Top Website Robots.txts

0% of top websites are blocking netarkivindsamling

Country of Origin

Unknown
netarkivindsamling has no known country of origin

Global Traffic

The percentage of all internet traffic coming from Archivers

Get These Insights for Your Website
Use the WordPress plugin, Node.js package, or API to get started in seconds.

Robots.txt

Should I Block netarkivindsamling?

It's up to you. Digital archiving is generally done to preserve a historical record. If you don't want your website included in that record, you can block archivers.

How Do I Block netarkivindsamling?

⚠️ Manual Robots.txt Edits Are Not Scalable
New agents are created every day. Instead, serve a continuously updating robots.txt that blocks new agents automatically.
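
One way to do that is to generate robots.txt on request from an up-to-date source instead of editing a static file by hand. Below is a minimal sketch in TypeScript using Express, assuming a hypothetical ROBOTS_TXT_SOURCE_URL endpoint that returns a complete robots.txt body (the Dark Visitors API, WordPress plugin, or Node.js package can fill that role); the fallback rules are the same static ones shown further down.

// Minimal sketch: serve /robots.txt from a periodically refreshed external source.
// ROBOTS_TXT_SOURCE_URL is a hypothetical endpoint that returns a ready-made
// robots.txt body; swap in whichever provider you actually use.
import express from "express";

const app = express();
const SOURCE_URL = process.env.ROBOTS_TXT_SOURCE_URL; // hypothetical endpoint
const REFRESH_MS = 60 * 60 * 1000;                    // re-fetch roughly once an hour

// Static fallback: the same rules as the manual example below.
const FALLBACK = "User-agent: netarkivindsamling\nDisallow: /\n";

let cached = FALLBACK;
let fetchedAt = 0;

async function getRobotsTxt(): Promise<string> {
  const stale = Date.now() - fetchedAt > REFRESH_MS;
  if (SOURCE_URL && stale) {
    try {
      const response = await fetch(SOURCE_URL); // Node 18+ global fetch
      if (response.ok) {
        cached = await response.text();
        fetchedAt = Date.now();
      }
    } catch {
      // Network error: keep serving the last known rules.
    }
  }
  return cached;
}

app.get("/robots.txt", async (_req, res) => {
  res.type("text/plain").send(await getRobotsTxt());
});

app.listen(3000);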

You can block netarkivindsamling or limit its access by setting user agent token rules in your website's robots.txt. Set up Dark Visitors agent analytics to check whether it's actually following them.

# robots.txt
# This should block netarkivindsamling

User-agent: netarkivindsamling
Disallow: /
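
If you would rather limit netarkivindsamling's access than block it outright, you can disallow only parts of your site. A minimal sketch, assuming hypothetical /private/ and /drafts/ paths; substitute the paths you actually want to keep out of the archive.

# robots.txt
# This should limit netarkivindsamling to public content

User-agent: netarkivindsamling
Disallow: /private/
Disallow: /drafts/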