LinkCheck by Siteimprove.com

What is LinkCheck by Siteimprove.com?

About

LinkCheck by Siteimprove.com is an uncategorized agent. If you think this is incorrect or can provide additional detail about its purpose, please contact us. You can see how often LinkCheck by Siteimprove.com visits your website by setting up Dark Visitors Agent Analytics.

Expected Behavior

Behavior will vary depending on whether this agent is a search engine crawler, data scraper, archiver, one-off fetcher, etc.

Type

Uncategorized
Not currently assigned a type

Detail

Last Updated 3 hours ago

Insights

Top Website Robots.txts

0% of top websites are blocking LinkCheck by Siteimprove.com

Country of Origin

United States
LinkCheck by Siteimprove.com normally visits from the United States

Global Traffic

The percentage of all internet traffic coming from Uncategorized Agents

Top Visited Website Categories

Science
Health
Law and Government
Food and Drink
People and Society

How Do I Get These Insights for My Website?

Use the WordPress plugin, Node.js package, or API to get started in seconds.

Robots.txt

Should I Block LinkCheck by Siteimprove.com?

Without a known type, it's difficult to say. Depending on what this agent actually does, its visits could be either beneficial or harmful to your website.

How Do I Block LinkCheck by Siteimprove.com?

You can block LinkCheck by Siteimprove.com or limit its access by setting user agent token rules in your website's robots.txt. Set up Dark Visitors Agent Analytics to check whether it's actually following them.
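As a sanity check before deploying, you can verify that a robots.txt rule actually matches this agent's token using Python's standard-library urllib.robotparser. This is a minimal sketch using the example rule from this page; the URL is a placeholder.

```python
from urllib.robotparser import RobotFileParser

# The robots.txt rule shown on this page, blocking the agent site-wide
ROBOTS_TXT = """\
User-agent: LinkCheck by Siteimprove.com
Disallow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# robotparser matches the User-agent token against the name you pass in
blocked = not parser.can_fetch("LinkCheck by Siteimprove.com", "https://example.com/page")
print(blocked)  # True: the Disallow rule applies to this agent

# Agents not named in robots.txt are unaffected (no wildcard rule here)
other_ok = parser.can_fetch("Googlebot", "https://example.com/page")
print(other_ok)  # True
```

Note that robots.txt rules are advisory: a check like this confirms your rules are written correctly, not that the agent obeys them, which is why verifying compliance with agent analytics is still worthwhile.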

How Do I Block All Uncategorized Agents?

Serve a continuously updating robots.txt that blocks new uncategorized agents automatically.

User Agent String

Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/129.0.0.0 Safari/537.36 LinkCheck by Siteimprove.com

# In your robots.txt ...

User-agent: LinkCheck by Siteimprove.com # https://darkvisitors.com/agents/linkcheck-by-siteimprove-com
Disallow: /

⚠️ Manual Robots.txt Editing Is Not Scalable

New agents are created every day. We recommend setting up Dark Visitors Automatic Robots.txt if you want to block all agents of this type.
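A continuously updating robots.txt is usually implemented as a small server-side cache: your site periodically refetches the latest rules from an upstream source and serves the cached copy at /robots.txt. This is a minimal sketch of that caching pattern; the fetch callable is a stand-in for whatever upstream request your setup uses (here stubbed with a fixed string), and the names are illustrative, not a real API.

```python
import time

class RobotsTxtCache:
    """Serve a cached robots.txt, refreshed from an upstream source on a TTL.

    `fetch` is any zero-argument callable returning the latest robots.txt
    text (e.g. an HTTP request to a service tracking newly seen agents).
    """

    def __init__(self, fetch, ttl_seconds=3600):
        self._fetch = fetch
        self._ttl = ttl_seconds
        self._cached = None
        self._fetched_at = 0.0

    def get(self):
        # Refetch only when the cached copy is missing or older than the TTL
        now = time.monotonic()
        if self._cached is None or now - self._fetched_at >= self._ttl:
            self._cached = self._fetch()
            self._fetched_at = now
        return self._cached

# Usage with a stub fetcher standing in for the real upstream request
calls = []
def stub_fetch():
    calls.append(1)
    return "User-agent: LinkCheck by Siteimprove.com\nDisallow: /\n"

cache = RobotsTxtCache(stub_fetch, ttl_seconds=3600)
first = cache.get()
second = cache.get()   # within the TTL: served from cache
print(len(calls))      # 1, only one upstream fetch happened
```

Caching with a TTL keeps the rules reasonably fresh without adding an upstream request to every crawler visit.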