Nicecrawler

What is Nicecrawler?

About

Nicecrawler is an archiver operated by NiceCrawler. If you think this is incorrect or can provide additional detail about its purpose, please contact us. You can see how often Nicecrawler visits your website by setting up Dark Visitors Agent Analytics.

Expected Behavior

Archivers visit websites on a roughly regular cadence, since evenly spaced snapshots make for a more useful historical record. Popular websites are visited more often because they're more likely to be queried in the historical database later.

Type

Archiver
Snapshots websites for historical databases

Detail

Operated By: NiceCrawler
Last Updated: 2 hours ago

Insights

Top Website Robots.txts

0% of top websites are blocking Nicecrawler

Country of Origin

Nicecrawler normally visits from the United States

Global Traffic

The percentage of all internet traffic coming from Archivers

Top Visited Website Categories

Home and Garden
Travel and Transportation
Hobbies and Leisure
Business and Industrial
People and Society

How Do I Get These Insights for My Website?

Use the WordPress plugin, Node.js package, or API to get started in seconds.
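
Under the hood, these insights come from matching each request's User-Agent header against known agent tokens and counting the hits. A minimal hand-rolled sketch of that idea in TypeScript (illustrative only, not the Dark Visitors package API; the port and response are placeholders):

import { createServer } from "node:http";

// Count visits whose User-Agent contains the Nicecrawler token
let nicecrawlerVisits = 0;

const server = createServer((req, res) => {
  const userAgent = req.headers["user-agent"] ?? "";
  if (userAgent.includes("Nicecrawler")) {
    nicecrawlerVisits += 1;
    console.log(`Nicecrawler visit #${nicecrawlerVisits}: ${req.url}`);
  }
  res.end("ok"); // placeholder response
});

server.listen(8080); // illustrative port

In a real deployment you'd forward these events to an analytics backend instead of logging them, which is the role the plugin, package, and API fill.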

Robots.txt

Should I Block Nicecrawler?

It's up to you. Digital archiving is generally done to preserve a historical record. If you don't want to be part of that record for some reason, you can block archivers.

How Do I Block Nicecrawler?

You can block Nicecrawler or limit its access by setting user agent token rules in your website's robots.txt. Set up Dark Visitors Agent Analytics to check whether it's actually following them.
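
If you'd rather limit Nicecrawler's access than block it entirely, you can disallow only the paths you want kept out of the archive. A minimal sketch, where /private/ and /drafts/ are hypothetical placeholders for your own paths:

User-agent: Nicecrawler
Disallow: /private/
Disallow: /drafts/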

To block it outright:

# In your robots.txt ...

User-agent: Nicecrawler # https://darkvisitors.com/agents/nicecrawler
Disallow: /

User Agent String

Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; compatible; Nicecrawler/1.1; +http://www.nicecrawler.com/) Chrome/90.0.4430.97 Safari/537.36

How Do I Block All Archivers?

Serve a continuously updating robots.txt that blocks new archivers automatically.

⚠️ Manual Robots.txt Editing Is Not Scalable

New agents are created every day. We recommend setting up Dark Visitors Automatic Robots.txt if you want to block all agents of this type.
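
If you want to see the shape of that setup, here's a minimal TypeScript sketch: periodically fetch a generated robots.txt that includes newly listed archivers, cache it, and serve the cached copy at /robots.txt. The endpoint URL, payload shape, and DARK_VISITORS_TOKEN variable are assumptions about the Dark Visitors API, so verify them against the current documentation:

import { createServer } from "node:http";

// Assumed endpoint and payload shape; check the Dark Visitors docs
const ROBOTS_TXT_API = "https://api.darkvisitors.com/robots-txts";
const ACCESS_TOKEN = process.env.DARK_VISITORS_TOKEN ?? "";

// Fallback that allows everything until the first fetch succeeds
let cachedRobotsTxt = "User-agent: *\nDisallow:";

async function refreshRobotsTxt(): Promise<void> {
  try {
    const response = await fetch(ROBOTS_TXT_API, {
      method: "POST",
      headers: {
        "Authorization": `Bearer ${ACCESS_TOKEN}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({ agent_types: ["Archiver"], disallow: "/" }),
    });
    if (response.ok) cachedRobotsTxt = await response.text();
  } catch {
    // Keep serving the last good copy on network errors
  }
}

// Refresh daily so newly listed archivers are picked up automatically
refreshRobotsTxt();
setInterval(refreshRobotsTxt, 24 * 60 * 60 * 1000);

createServer((req, res) => {
  if (req.url === "/robots.txt") {
    res.setHeader("Content-Type", "text/plain");
    res.end(cachedRobotsTxt);
    return;
  }
  res.end("ok"); // placeholder for your real request handling
}).listen(8080); // illustrative port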