SiteLockSpider

What is SiteLockSpider?

SiteLockSpider is the web scanner behind SiteLock's website protection service. It crawls websites to detect malware, malicious code, and security vulnerabilities. You can see how often SiteLockSpider visits your website by setting up Dark Visitors Agent Analytics.

Expected Behavior

Security scanners do not follow a predictable schedule when visiting websites. Their scans can be one-time, occasional, or recurring depending on the purpose of the scanner and the organization's security practices. The frequency and depth of their scans can vary based on factors like the visibility of the site on the public internet, past scan results, and inclusion in external threat intelligence feeds.
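
You can measure this frequency yourself from your server's access logs. As a rough illustration, the following Node.js/TypeScript sketch tallies SiteLockSpider requests per day from a combined-format access log. The log path is a placeholder; the "SiteLockSpider" token comes from the user agent string shown later on this page.

import { createReadStream } from "node:fs";
import { createInterface } from "node:readline";

// Tally SiteLockSpider visits per day from a combined-format access log.
async function countSiteLockVisits(logPath: string): Promise<Map<string, number>> {
  const perDay = new Map<string, number>();
  const lines = createInterface({ input: createReadStream(logPath) });

  for await (const line of lines) {
    // The quoted user agent in each log line contains the "SiteLockSpider" token.
    if (!line.includes("SiteLockSpider")) continue;
    // Timestamps look like [10/Oct/2025:13:55:36 +0000]; keep only the date part.
    const day = line.match(/\[(\d{2}\/\w{3}\/\d{4})/)?.[1] ?? "unknown";
    perDay.set(day, (perDay.get(day) ?? 0) + 1);
  }
  return perDay;
}

// Placeholder path; point this at your web server's access log.
countSiteLockVisits("/var/log/nginx/access.log").then((perDay) => {
  for (const [day, count] of perDay) console.log(`${day}: ${count} visits`);
});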

Type

Security Scanner
Scans websites to find vulnerabilities

Detail

Operated By SiteLock

Insights

Top Website Robots.txts

0% of top websites are blocking SiteLockSpider

Country of Origin

United States
SiteLockSpider normally visits from the United States

Global Traffic

The percentage of all internet traffic coming from Security Scanners

Top Visited Website Categories

Travel and Transportation
Real Estate
Health
Business and Industrial
Books and Literature
How Do I Get These Insights for My Website?

Use the WordPress plugin, Node.js package, or API to get started in seconds.
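
As a rough sketch of the API route, a Node.js server can forward each incoming request to Agent Analytics and let Dark Visitors classify the user agent. The endpoint URL and payload fields below are assumptions for illustration; confirm them against the current Dark Visitors API documentation.

import http from "node:http";

const DARK_VISITORS_TOKEN = process.env.DARK_VISITORS_TOKEN ?? "";

http.createServer((req, res) => {
  // Fire-and-forget so analytics reporting never delays the response.
  // Endpoint and payload shape are assumptions; check the API docs.
  fetch("https://api.darkvisitors.com/visits", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${DARK_VISITORS_TOKEN}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      request_path: req.url,
      request_method: req.method,
      request_headers: req.headers, // includes User-Agent for classification
    }),
  }).catch(() => { /* ignore analytics failures */ });

  res.end("Hello, world");
}).listen(8080);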

Robots.txt

Should I Block SiteLockSpider?

Probably not. Security scanners can be beneficial, especially if they're configured to report issues back to you.

How Do I Block SiteLockSpider?

You can block SiteLockSpider or limit its access by setting user agent token rules in your website's robots.txt. Set up Dark Visitors Agent Analytics to check whether it's actually following them.

User Agent String

SiteLockSpider [en] (WinNT; I ;Nav)

# In your robots.txt ...

User-agent: SiteLockSpider # https://darkvisitors.com/agents/sitelockspider
Disallow: /

How Do I Block All Security Scanners?

Serve a continuously updating robots.txt that blocks new security scanners automatically.
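
One way to automate this is to have your server cache a generated robots.txt and refresh it on a schedule, so rules for newly listed agents appear without manual edits. The sketch below assumes a Dark Visitors robots.txt generation endpoint and request body; treat both as illustrative and verify them against the current API documentation.

import http from "node:http";

let cachedRobotsTxt = "User-agent: *\nDisallow:"; // permissive fallback

// Refresh the cached robots.txt from the (assumed) generation endpoint.
async function refreshRobotsTxt(): Promise<void> {
  const response = await fetch("https://api.darkvisitors.com/robots-txts", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.DARK_VISITORS_TOKEN}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      agent_types: ["Security Scanner"], // block this whole agent type
      disallow: "/",
    }),
  });
  if (response.ok) cachedRobotsTxt = await response.text();
}

refreshRobotsTxt().catch(() => {});
setInterval(() => refreshRobotsTxt().catch(() => {}), 24 * 60 * 60 * 1000); // daily

http.createServer((req, res) => {
  if (req.url === "/robots.txt") {
    res.writeHead(200, { "Content-Type": "text/plain" });
    res.end(cachedRobotsTxt); // always serves the latest cached rules
  } else {
    res.writeHead(404);
    res.end();
  }
}).listen(8080);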

⚠️ Manual Robots.txt Editing Is Not Scalable

New agents are created every day. We recommend setting up Dark Visitors Automatic Robots.txt if you want to block all agents of this type.