Turnitin

What is Turnitin?

About

The Turnitin crawler gathers web content to build a comprehensive database for Turnitin's plagiarism detection service, which compares student papers against internet content to safeguard academic integrity. You can see how often Turnitin visits your website by setting up Dark Visitors Agent Analytics.

Agent Type

Archiver
Snapshots websites for historical databases

Expected Behavior

Archivers crawl websites to create historical snapshots for preservation purposes. They typically visit on a regular cadence to build a chronological record of how content changes over time. Crawl frequency varies based on site popularity and content update patterns. Unlike search crawlers, archivers aim to capture and store complete page states rather than extract information for indexing.

Detail

Operated By: Turnitin
Last Updated: 1 day ago

Insights

Top Website Robots.txts

0% of top websites are blocking Turnitin in their robots.txt

Country of Origin

Turnitin normally visits from the United States

Global Traffic

The percentage of all internet traffic coming from Archivers

Top Visited Website Categories

People and Society
Hobbies and Leisure
Arts and Entertainment
Books and Literature
Real Estate

How Do I Get These Insights for My Website?

Use the WordPress plugin, Node.js package, or API to get started in seconds.
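
Whichever route you choose, the idea is the same: forward each incoming request's metadata so visits can be attributed to agents like Turnitin. Below is a minimal Python sketch of that pattern; the endpoint URL, field names, and token handling are assumptions for illustration, so check the API documentation for the real contract.

import json
import urllib.request

API_URL = "https://api.darkvisitors.com/visits"  # assumed endpoint; confirm in the API docs
ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"  # placeholder

def report_visit(path: str, method: str, headers: dict) -> None:
    """Forward one incoming request's metadata to the analytics API."""
    body = json.dumps({
        "request_path": path,
        "request_method": method,
        "request_headers": headers,  # includes User-Agent, which identifies agents like Turnitin
    }).encode("utf-8")
    req = urllib.request.Request(
        API_URL,
        data=body,
        headers={
            "Authorization": f"Bearer {ACCESS_TOKEN}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    urllib.request.urlopen(req, timeout=5)  # fire-and-forget; add error handling in production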

Robots.txt

Should I Block Turnitin?

It's up to you. Digital archiving is generally done to preserve a historical record. If you don't want your website to be part of that record, you can block archivers.

How Do I Block Turnitin?

Turnitin identifies itself with the user agent string Turnitin (https://bit.ly/2UvnfoQ). You can block it or limit its access by setting rules for its user agent token in your website's robots.txt. Set up Dark Visitors Agent Analytics to check whether it's actually following them.

# In your robots.txt ...

User-agent: Turnitin # https://darkvisitors.com/agents/turnitin
Disallow: /
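
If you'd rather spot-check compliance yourself, you can scan your server's access log for the Turnitin user agent token. A minimal Python sketch, assuming a combined-format (Nginx/Apache) log at a hypothetical path:

import re

LOG_PATH = "/var/log/nginx/access.log"  # hypothetical path; adjust for your server
DISALLOWED_PREFIX = "/"  # mirrors the Disallow: / rule above

# Combined log format: ... "METHOD /path HTTP/1.1" status size "referer" "user agent"
LINE = re.compile(r'"[A-Z]+ (?P<path>\S+) HTTP/[^"]*" \d+ \S+ "[^"]*" "(?P<ua>[^"]*)"')

with open(LOG_PATH) as log:
    for line in log:
        m = LINE.search(line)
        if m and "Turnitin" in m.group("ua") and m.group("path").startswith(DISALLOWED_PREFIX):
            print("Possible robots.txt violation:", m.group("path"))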

How Do I Block All Archivers?

Serve a continuously updating robots.txt that blocks new archivers automatically.

⚠️ Manual Robots.txt Editing Is Not Scalable

New agents are created every day. We recommend setting up Dark Visitors Automatic Robots.txt if you want to block all agents of this type.
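
The general pattern is to fetch a freshly generated robots.txt on a schedule instead of editing the file by hand. A minimal Python sketch, assuming a generator endpoint like https://api.darkvisitors.com/robots-txts that accepts a list of agent types; the endpoint and field names here are assumptions, so check the API documentation before using it:

import json
import urllib.request

API_URL = "https://api.darkvisitors.com/robots-txts"  # assumed endpoint; confirm in the API docs
ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"  # placeholder

def refresh_robots_txt() -> None:
    """Fetch a generated robots.txt that disallows all known archivers, then cache it locally."""
    body = json.dumps({
        "agent_types": ["Archiver"],  # assumed field name; block every agent of this type
        "disallow": "/",
    }).encode("utf-8")
    req = urllib.request.Request(
        API_URL,
        data=body,
        headers={
            "Authorization": f"Bearer {ACCESS_TOKEN}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        robots = resp.read().decode("utf-8")
    with open("robots.txt", "w") as f:  # serve this file from your web root
        f.write(robots)

# Run refresh_robots_txt() from a daily cron job so new archivers are blocked automatically.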
