Alexa Site Audit
What is Alexa Site Audit?
About
Alexa Site Audit is a search engine crawler operated by Alexandria.org. If you think this is incorrect or can provide additional detail about its purpose, please contact us. You can see how often Alexa Site Audit visits your website by setting up Dark Visitors Agent Analytics.
Expected Behavior
Search engine crawlers do not adhere to a fixed visitation schedule for websites. The frequency of visits varies widely based on several factors, including the website's popularity, the rate at which its content is updated, and its overall trustworthiness. Websites with fresh, high-quality content tend to be crawled more frequently, while less active or less reputable sites may be visited less often.
| Type | Detail |
| --- | --- |
| Operated By | Alexandria.org |
| Last Updated | 19 hours ago |
Robots.txt
Should I Block Alexa Site Audit?
Probably not. Search engine crawlers power search engines, which are a useful way for users to discover your website. In fact, blocking search engine crawlers could severely reduce your traffic.
How Do I Block Alexa Site Audit?
You can block Alexa Site Audit, or limit its access, by setting user agent token rules in your website's robots.txt file. Set up Dark Visitors Agent Analytics to check whether it's actually following them.
User Agent String | Mozilla/5.0 (compatible; alexa site audit/1.0; +http://www.alexa.com/help/webmasters; ) |
# In your robots.txt ...
User-agent: Alexa Site Audit # https://darkvisitors.com/agents/alexa-site-audit
Disallow: /
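Because robots.txt is advisory, it's worth confirming from your server's access logs whether the crawler is still visiting after you add the rule. Below is a minimal sketch that counts requests whose User-Agent contains the crawler's token; the sample log lines are made-up examples in the common combined log format, and real logs will vary by server configuration.

```python
# Sketch: count hits from a crawler by matching its User-Agent token
# in access log lines. The sample lines below are fabricated examples.

log_lines = [
    '203.0.113.7 - - [01/Jan/2025:12:00:00 +0000] "GET / HTTP/1.1" 200 512 '
    '"-" "Mozilla/5.0 (compatible; alexa site audit/1.0; '
    '+http://www.alexa.com/help/webmasters; )"',
    '198.51.100.4 - - [01/Jan/2025:12:00:05 +0000] "GET /about HTTP/1.1" 200 1024 '
    '"-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"',
]

def count_crawler_hits(lines, token="alexa site audit"):
    """Count log lines whose User-Agent contains the token (case-insensitive)."""
    return sum(1 for line in lines if token in line.lower())

print(count_crawler_hits(log_lines))  # 1
```

If the count keeps growing after the robots.txt rule has been in place for a while, the crawler is ignoring it, and you would need server-level blocking (e.g. by user agent or IP) instead.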
⚠️ Manual Robots.txt Editing Is Not Scalable
New agents are created every day. We recommend setting up Dark Visitors Automatic Robots.txt if you want to block all agents of this type.