SecurityHeaders

What is SecurityHeaders?

About

SecurityHeaders is a free security scanning service that analyzes HTTP security headers on websites, providing grades and recommendations to improve web security posture and protect against common vulnerabilities. You can see how often SecurityHeaders visits your website by setting up Dark Visitors Agent Analytics.
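SecurityHeaders' grade is driven by response headers such as Strict-Transport-Security, Content-Security-Policy, X-Content-Type-Options, X-Frame-Options, Referrer-Policy, and Permissions-Policy. As a rough illustration (not an official example from the service), a plain Node.js server could set them like this; the policy values are placeholders to tune for your own site.

// Illustrative sketch: set the response headers a scan like SecurityHeaders checks.
// The policy values below are placeholders, not recommendations for any specific site.
import { createServer } from "node:http";

const server = createServer((req, res) => {
  res.setHeader("Strict-Transport-Security", "max-age=31536000; includeSubDomains");
  res.setHeader("Content-Security-Policy", "default-src 'self'");
  res.setHeader("X-Content-Type-Options", "nosniff");
  res.setHeader("X-Frame-Options", "DENY");
  res.setHeader("Referrer-Policy", "strict-origin-when-cross-origin");
  res.setHeader("Permissions-Policy", "camera=(), geolocation=(), microphone=()");
  res.writeHead(200, { "Content-Type": "text/html; charset=utf-8" });
  res.end("<h1>Hello</h1>");
});

server.listen(8080);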

Expected Behavior

Security scanners do not follow a predictable schedule when visiting websites. Their scans can be one-time, occasional, or recurring depending on the purpose of the scanner and the organization's security practices. The frequency and depth of their scans can vary based on factors like the visibility of the site on the public internet, past scan results, and inclusion in external threat intelligence feeds.

Type

Security Scanner
Scans websites to find vulnerabilities

Detail

Operated By: Security Headers
Last Updated: 1 day ago

Insights

Top Website Robots.txts

0% of top websites are blocking SecurityHeaders

Country of Origin

United States
SecurityHeaders normally visits from the United States

Global Traffic

The percentage of all internet traffic coming from Security Scanners

Top Visited Website Categories

Hobbies and Leisure
Jobs and Education
Finance
Computers and Electronics
Travel and Transportation

How Do I Get These Insights for My Website?

Use the WordPress plugin, Node.js package, or API to get started in seconds.
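If you go the API route, the general shape is to forward each request's metadata (path, method, and headers, including the user agent) from your server so visits by agents like SecurityHeaders can be attributed. The sketch below is hedged: the endpoint URL, auth header, and payload field names are assumptions, so confirm them against the Dark Visitors API documentation.

// Hedged sketch of server-side visit reporting for agent analytics.
// ASSUMPTION: the endpoint, bearer-token auth, and field names below are
// illustrative; check the Dark Visitors API docs for the real contract.
export async function reportVisit(req: {
  url: string;
  method: string;
  headers: Record<string, string>;
}): Promise<void> {
  try {
    await fetch("https://api.darkvisitors.com/visits", { // assumed endpoint
      method: "POST",
      headers: {
        Authorization: `Bearer ${process.env.DARK_VISITORS_TOKEN ?? ""}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        request_path: req.url,        // assumed field names
        request_method: req.method,
        request_headers: req.headers, // includes User-Agent, which identifies the scanner
      }),
    });
  } catch {
    // Analytics reporting should never break normal request handling.
  }
}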

Robots.txt

Should I Block SecurityHeaders?

Probably not. Security scanners can be beneficial, especially if they're configured to report issues back to you.

How Do I Block SecurityHeaders?

You can block SecurityHeaders or limit its access by setting user agent token rules in your website's robots.txt. Set up Dark Visitors Agent Analytics to check whether it's actually following them.

User Agent String

Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/132.0.0.0 Safari/537.36 SecurityHeaders

# In your robots.txt ...

User-agent: SecurityHeaders # https://darkvisitors.com/agents/securityheaders
Disallow: /

How Do I Block All Security Scanners?

Serve a continuously updating robots.txt that blocks new security scanners automatically.
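One way to do this is to periodically fetch a generated robots.txt, cache it, and serve the cached copy, so newly listed scanner user agents get blocked without manual edits. The sketch below assumes a Dark Visitors-style generation endpoint and request body purely for illustration; follow the Automatic Robots.txt documentation for the supported integration.

// Hedged sketch: serve a cached, periodically refreshed robots.txt.
// ASSUMPTION: the endpoint and request body are illustrative placeholders.
import { createServer } from "node:http";

let cachedRobotsTxt = "User-agent: SecurityHeaders\nDisallow: /\n"; // fallback rules

async function refreshRobotsTxt(): Promise<void> {
  try {
    const response = await fetch("https://api.darkvisitors.com/robots-txts", { // assumed endpoint
      method: "POST",
      headers: {
        Authorization: `Bearer ${process.env.DARK_VISITORS_TOKEN ?? ""}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({ agent_types: ["Security Scanner"], disallow: "/" }), // assumed fields
    });
    if (response.ok) cachedRobotsTxt = await response.text();
  } catch {
    // Keep serving the last known good copy if a refresh fails.
  }
}

refreshRobotsTxt();
setInterval(refreshRobotsTxt, 24 * 60 * 60 * 1000); // refresh daily

createServer((req, res) => {
  if (req.url === "/robots.txt") {
    res.writeHead(200, { "Content-Type": "text/plain" });
    res.end(cachedRobotsTxt);
    return;
  }
  res.writeHead(404);
  res.end();
}).listen(8080);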

⚠️ Manual Robots.txt Editing Is Not Scalable

New agents are created every day. We recommend setting up Dark Visitors Automatic Robots.txt if you want to block all agents of this type.
