What Is GuestpostsBot?
GuestpostsBot is a specialized web crawler that monitors websites registered on the guestposts.com.br platform, tracking guest post partnerships, validating site existence for registration, and monitoring site status for owners. You can see how often GuestpostsBot visits your website by setting up Dark Visitors Agent Analytics.
Agent Type
Fetcher
Expected Behavior
Fetchers retrieve metadata from web pages to generate link previews in social media platforms, messaging apps, and content aggregators. They're triggered on-demand when users share or post links, fetching information like titles, descriptions, and thumbnail images. Traffic is unpredictable and correlates with how often your content is shared. Viral content may trigger thousands of fetcher requests in a short period. Fetchers typically access only the shared URL rather than crawling your site.
Detail
Operated By: Guest Posts
Last Updated: 21 hours ago
Top Website Blocking Trend Over Time
The percentage of the world's top 1,000 websites that are blocking GuestpostsBot
Overall Fetcher Traffic
The percentage of all internet traffic coming from fetchers
Robots.txt
In this example, all pages are blocked. You can customize which pages are off-limits by swapping out / for a different disallowed path.
User-agent: GuestpostsBot # https://darkvisitors.com/agents/guestpostsbot
Disallow: /
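For instance, to keep GuestpostsBot out of only part of your site while leaving the rest accessible, you could narrow the rule to a specific path (the /private/ path here is a hypothetical example, not one from this page):

```
User-agent: GuestpostsBot # https://darkvisitors.com/agents/guestpostsbot
Disallow: /private/
```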
Frequently Asked Questions About GuestpostsBot
Should I Block GuestpostsBot?
No. Blocking fetchers prevents link previews from appearing when your content is shared on social media, messaging apps, and other platforms. This significantly reduces click-through rates and social engagement. Link previews are crucial for content distribution.
How Do I Block GuestpostsBot?
If you want to, you can block or limit GuestpostsBot's access by configuring user agent token rules in your robots.txt file. The best way to do this is using Automatic Robots.txt, which blocks all agents of this type and updates continuously as new agents are released. While the vast majority of agents operated by reputable companies honor these robots.txt directives, bad actors may choose to ignore them entirely. In that case, you'll need to implement alternative blocking methods such as firewall rules or server-level restrictions. You can verify whether GuestpostsBot is respecting your rules by setting up Agent Analytics to monitor its visits to your website.
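As one sketch of such a server-level restriction, assuming your site runs on nginx (other servers have equivalent mechanisms), you could reject requests by matching the User-Agent header inside a server block:

```
# Return 403 Forbidden to any request whose User-Agent
# contains "GuestpostsBot" (case-insensitive match)
if ($http_user_agent ~* "guestpostsbot") {
    return 403;
}
```

Note that, unlike robots.txt, this enforces the block on the server side, so it also applies to agents that ignore robots.txt directives.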
Will Blocking GuestpostsBot Hurt My SEO?
Blocking fetchers will hurt your social SEO and content distribution. Link previews significantly improve click-through rates from social media, messaging apps, and other platforms. Without previews, your content appears less engaging when shared, reducing social signals that can indirectly benefit search rankings.
Does GuestpostsBot Access Private Content?
Fetchers only access the specific URLs that users share or embed, without credentials or authentication. They're designed to retrieve publicly accessible metadata and preview information. Fetchers don't crawl beyond the shared URL and can't access private content unless the shared link itself provides public access to otherwise private information.
How Can I Tell if GuestpostsBot Is Visiting My Website?
Setting up Agent Analytics will give you real-time visibility into GuestpostsBot's visits to your website, along with hundreds of other AI agents, crawlers, and scrapers. It will also let you measure human traffic arriving at your website from AI search and chat platforms like ChatGPT, Perplexity, and Gemini.
Why Is GuestpostsBot Visiting My Website?
GuestpostsBot visited your site because someone shared one of your URLs on a social platform, messaging app, or another service that generates link previews. The fetcher was triggered when the link was posted to retrieve your page's title, description, and preview image.
How Can I Authenticate Visits From GuestpostsBot?
Agent Analytics authenticates visits from many agents, telling you whether each visit actually came from the claimed agent or was spoofed by a bad actor. This helps you identify suspicious traffic patterns and make informed decisions about blocking or allowing specific user agents.
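Authentication of crawler visits is commonly done with forward-confirmed reverse DNS: resolve the visiting IP to a hostname, resolve that hostname back to IPs, and check both directions agree. This page does not state which hostnames GuestpostsBot uses, so the suffix below is a placeholder assumption; the pattern itself is the generic one used to verify crawlers such as Googlebot. A minimal sketch in Python:

```python
import socket

# Hypothetical hostname suffix for GuestpostsBot; the operator's actual
# verified hostnames are not published on this page.
ASSUMED_SUFFIXES = (".guestposts.com.br",)

def hostname_matches(hostname: str, suffixes=ASSUMED_SUFFIXES) -> bool:
    """Check whether a reverse-DNS hostname ends with a trusted suffix."""
    return hostname.endswith(suffixes)

def verify_visit(ip: str, suffixes=ASSUMED_SUFFIXES) -> bool:
    """Forward-confirmed reverse DNS: reverse-resolve the IP, then
    forward-resolve the hostname and confirm it maps back to the IP."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)           # reverse lookup
        forward_ips = socket.gethostbyname_ex(hostname)[2]  # forward lookup
    except (socket.herror, socket.gaierror):
        return False
    return hostname_matches(hostname, suffixes) and ip in forward_ips
```

A user-agent string alone is trivial to spoof, which is why the check compares DNS in both directions rather than trusting the request header.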