What Is DotBot?
DotBot is Moz's web crawler. It gathers web data for the Moz Link Index, which powers Moz Pro campaigns, Link Explorer, and the Moz Links API for analyzing website backlinks and SEO metrics. You can see how often DotBot visits your website by setting up Dark Visitors Agent Analytics.
Agent Type
SEO Crawler
Expected Behavior
SEO crawlers analyze websites to gather search optimization data like keyword rankings, backlink profiles, site structure, and page performance. Most of them also build up proprietary databases that power SEO analytics tools and competitive intelligence platforms. Crawl frequency varies based on factors like site authority, backlink popularity, ranking performance, and whether the site is actively monitored by the service's customers. These crawlers typically perform comprehensive site scans, following internal links to map site architecture and assess optimization opportunities.
Detail
Operated By: Moz
Last Updated: 14 hours ago
[Charts: country of origin, top website robots.txts, top website blocking trend over time (the percentage of the world's top 1,000 websites blocking DotBot), overall SEO crawler traffic (the percentage of all internet traffic coming from SEO crawlers), and top visited website categories]
User Agent String
Example: Mozilla/5.0 (compatible; DotBot/1.2; +https://opensiteexplorer.org/dotbot; help@moz.com)
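If you handle requests in your own application code, a simple substring check against this token is enough to recognize traffic that identifies itself as DotBot. Below is a minimal sketch in Python; the helper name and sample header are illustrative, and because the string can be spoofed, a match only tells you what the request claims to be.

# Minimal sketch: recognize requests that identify themselves as DotBot.
# The helper and sample header are illustrative; the string can be spoofed.
def is_dotbot(user_agent: str) -> bool:
    return "dotbot" in user_agent.lower()

ua = "Mozilla/5.0 (compatible; DotBot/1.2; +https://opensiteexplorer.org/dotbot; help@moz.com)"
print(is_dotbot(ua))  # True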
Robots.txt
In this example, all pages are blocked. You can customize which pages are off-limits by swapping out / for a different disallowed path.
User-agent: DotBot # https://darkvisitors.com/agents/dotbot
Disallow: /
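If you swap / for a narrower path, it's worth sanity-checking the rules before deploying them. Here is a small sketch using Python's built-in urllib.robotparser; the /private/ path and example.com URLs are placeholders, not paths DotBot is known to request.

# Sketch: test robots.txt rules against sample URLs before deploying them.
# The /private/ path and example.com URLs are placeholders.
from urllib import robotparser

rules = [
    "User-agent: DotBot",
    "Disallow: /private/",
]

rp = robotparser.RobotFileParser()
rp.parse(rules)

print(rp.can_fetch("DotBot", "https://example.com/private/report.html"))  # False
print(rp.can_fetch("DotBot", "https://example.com/blog/post.html"))       # True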
Frequently Asked Questions About DotBot
Should I Block DotBot?
Generally no. DotBot helps monitor and improve search performance across the web. If you use SEO tools yourself, blocking it undermines the ecosystem. However, you might limit DotBot's access if it consumes excessive server resources or crawls too aggressively.
How Do I Block DotBot?
If you want to, you can block or limit DotBot's access by adding rules for its user agent token to your robots.txt file. The best way to do this is with Automatic Robots.txt, which blocks all agents of this type and updates continuously as new agents are released. While the vast majority of agents operated by reputable companies honor robots.txt directives, bad actors may ignore them entirely. In that case, you'll need alternative blocking methods such as firewall rules or server-level restrictions. You can verify whether DotBot is respecting your rules by setting up Agent Analytics to monitor its visits to your website.
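For the server-level option, one lightweight approach is to reject requests whose user agent identifies as DotBot before they reach your application. The sketch below assumes a Python WSGI application; it only stops traffic that identifies itself honestly, so combine it with firewall rules if you need a harder block.

# Sketch: WSGI middleware that returns 403 to requests identifying as DotBot.
# Assumes your app speaks WSGI; only blocks honestly identified traffic.
def block_dotbot(app):
    def middleware(environ, start_response):
        user_agent = environ.get("HTTP_USER_AGENT", "")
        if "dotbot" in user_agent.lower():
            start_response("403 Forbidden", [("Content-Type", "text/plain")])
            return [b"Forbidden"]
        return app(environ, start_response)
    return middleware

# Hypothetical usage: application = block_dotbot(application)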
Will Blocking DotBot Hurt My SEO?
Blocking SEO crawlers has minimal direct impact on your search rankings, since they don't perform the actual search indexing. However, these tools help the SEO ecosystem function by providing competitive analysis and optimization insights, and widespread blocking could reduce overall SEO tool effectiveness.
Does DotBot Access Private Content?
SEO crawlers typically analyze publicly accessible content to gather optimization insights. They generally don't attempt to access private or authenticated content, focusing instead on pages that are publicly indexable. However, some advanced SEO tools may analyze login pages, error pages, or other publicly accessible but sensitive areas to assess site security and structure.
How Can I Tell if DotBot Is Visiting My Website?
Setting up Agent Analytics gives you real-time visibility into DotBot's visits to your website, along with hundreds of other AI agents, crawlers, and scrapers. It also lets you measure human traffic to your website coming from AI search and LLM chat platforms like ChatGPT, Perplexity, and Gemini.
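If you'd rather check your own server logs first, counting requests whose user agent contains the DotBot token gives a rough picture of visit frequency and the paths being crawled. A minimal sketch, assuming an access log in the combined format at a hypothetical path:

# Sketch: count DotBot-identified requests per path in an access log.
# The log path and combined log format are assumptions.
from collections import Counter

hits = Counter()
with open("/var/log/nginx/access.log", encoding="utf-8", errors="replace") as log:
    for line in log:
        if "dotbot" not in line.lower():
            continue
        try:
            path = line.split('"')[1].split()[1]  # request line: METHOD PATH PROTOCOL
        except IndexError:
            path = "?"
        hits[path] += 1

for path, count in hits.most_common(10):
    print(count, path)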
Why Is DotBot Visiting My Website?
DotBot likely found your site through public web discovery, competitor analysis, or because your domain appears in backlink databases, ranking reports, or other SEO intelligence sources. Your site may be monitored by their customers or identified as relevant for competitive analysis.
How Can I Authenticate Visits From DotBot?
Agent Analytics authenticates visits from many agents, letting you know whether each one actually came from that agent or was spoofed by a bad actor. This helps you identify suspicious traffic patterns and make informed decisions about blocking or allowing specific user agents.
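If you want a rough manual check, the common technique for authenticating a crawler visit is a reverse DNS lookup on the requesting IP followed by a forward lookup to confirm the hostname resolves back to that IP. The sketch below shows the general pattern in Python; the expected hostname suffix is an assumption for illustration, not a value Moz publishes here.

# Sketch: generic reverse-then-forward DNS check for a claimed crawler IP.
# The expected hostname suffix is an assumption, not a documented Moz value.
import socket

def resolves_to(ip: str, suffixes: tuple) -> bool:
    try:
        host = socket.gethostbyaddr(ip)[0]             # reverse lookup
        if not host.endswith(suffixes):
            return False
        return ip in socket.gethostbyname_ex(host)[2]  # forward confirmation
    except OSError:
        return False

# Hypothetical usage:
# print(resolves_to("203.0.113.10", (".moz.com",)))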