SiteCheck-sitecrawl

What is SiteCheck-sitecrawl?

About

SiteCheck-sitecrawl is Siteimprove's crawler that performs comprehensive website audits for its content intelligence platform, scanning sites for accessibility, SEO, quality assurance, and digital governance compliance. You can see how often SiteCheck-sitecrawl visits your website by setting up Dark Visitors Agent Analytics.

Expected Behavior

SEO (search engine optimization) crawlers do not adhere to a fixed visitation schedule. How often one visits a given website varies widely based on several factors, including the search keywords the site ranks for and how many other websites link to it.

Type

SEO Crawler
Discovers search engine optimization insights

Detail

Operated By Siteimprove
Last Updated 10 hours ago

Insights

Top Website Robots.txts

0%
0% of top websites are blocking SiteCheck-sitecrawl

Country of Origin

United States
SiteCheck-sitecrawl normally visits from the United States

Global Traffic

The percentage of all internet traffic coming from SEO Crawlers

How Do I Get These Insights for My Website?
Use the WordPress plugin, Node.js package, or API to get started in seconds.
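
If you want a rough sense of these numbers before installing anything, here is a minimal sketch in TypeScript of counting the crawler's visits yourself. It assumes an Express server and matches on the "SiteCheck-sitecrawl" token from the user agent string shown further down; the Dark Visitors integrations do this kind of tracking for you.

// Minimal sketch: count SiteCheck-sitecrawl visits by inspecting the
// User-Agent header. Illustration only; the Express setup is an assumption.
import express from "express";

const app = express();
let siteCheckVisits = 0;

// Count any request whose User-Agent contains the crawler's token
app.use((req, _res, next) => {
  const ua = req.get("user-agent") ?? "";
  if (ua.includes("SiteCheck-sitecrawl")) {
    siteCheckVisits += 1;
    console.log(`SiteCheck-sitecrawl visit #${siteCheckVisits}: ${req.path}`);
  }
  next();
});

app.get("/", (_req, res) => {
  res.send("Hello");
});

app.listen(3000);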

Robots.txt

Should I Block SiteCheck-sitecrawl?

Probably not, especially if you benefit from an SEO service yourself. However, you might choose to block them if you're concerned about things like server resource usage.

How Do I Block SiteCheck-sitecrawl?

You can block SiteCheck-sitecrawl or limit its access by setting user agent token rules in your website's robots.txt, as in the example below. Set up Dark Visitors Agent Analytics to check whether it's actually following them.

User Agent String

Mozilla/5.0 (compatible; MSIE 10.0; Windows NT 6.1; Trident/6.0) SiteCheck-sitecrawl by Siteimprove.com

# In your robots.txt ...

User-agent: SiteCheck-sitecrawl # https://darkvisitors.com/agents/sitecheck-sitecrawl
Disallow: /
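
To spot-check whether the crawler is actually honoring the rule above, one approach is to scan your server's access log for SiteCheck-sitecrawl requests to disallowed paths. This TypeScript sketch assumes a Node.js environment and a common-log-format log at a hypothetical path; adjust both for your own server.

// Rough sketch: flag SiteCheck-sitecrawl requests that hit disallowed paths.
// The log path and format are assumptions, not a prescribed setup.
import { readFileSync } from "node:fs";

const LOG_PATH = "/var/log/nginx/access.log"; // assumption: adjust for your server
const DISALLOWED_PREFIX = "/"; // mirrors the Disallow: / rule above

const lines = readFileSync(LOG_PATH, "utf8").split("\n");
const violations = lines.filter((line) => {
  if (!line.includes("SiteCheck-sitecrawl")) return false;
  // Crude request-path extraction from a common-log-format line
  const match = line.match(/"(?:GET|POST|HEAD) ([^ ]+)/);
  return match !== null && match[1].startsWith(DISALLOWED_PREFIX);
});

console.log(`${violations.length} SiteCheck-sitecrawl requests hit disallowed paths`);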

How Do I Block All SEO Crawlers?

New agents are created every day, so manually editing your robots.txt for each one is not scalable. Instead, serve a continuously updating robots.txt that blocks new SEO crawlers automatically. We recommend setting up Dark Visitors Automatic Robots.txt if you want to block all agents of this type.
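
The general pattern behind a continuously updating robots.txt is to serve /robots.txt from a cached copy that is refreshed from an upstream generator on an interval. The TypeScript sketch below illustrates the idea only; UPSTREAM_URL is a placeholder, not the actual Dark Visitors endpoint, so consult their documentation for the real API.

// Sketch of a self-refreshing robots.txt route. Requires Node 18+ for
// the built-in fetch. UPSTREAM_URL is a hypothetical placeholder.
import express from "express";

const UPSTREAM_URL = "https://example.com/generated-robots.txt"; // placeholder
const REFRESH_MS = 60 * 60 * 1000; // refresh hourly

let cached = "User-agent: *\nDisallow:"; // permissive fallback until first fetch

async function refresh(): Promise<void> {
  try {
    const res = await fetch(UPSTREAM_URL);
    if (res.ok) cached = await res.text();
  } catch {
    // Keep serving the last good copy if the upstream is unreachable
  }
}

refresh();
setInterval(refresh, REFRESH_MS);

const app = express();
app.get("/robots.txt", (_req, res) => {
  res.type("text/plain").send(cached);
});
app.listen(3000);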
