Gigabot

What is Gigabot?

About

Gigabot is the web crawler used by Gigablast, an independent U.S. search engine that crawls and indexes web content using its own proprietary search algorithms and database. You can see how often Gigabot visits your website by setting up Dark Visitors Agent Analytics.
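If you want a quick spot check before setting up analytics, scanning your server's access log for Gigabot's user agent token gives a rough visit count. Below is a minimal sketch in TypeScript for Node.js; the log path and combined log format are assumptions, so adjust both for your setup.

// count-gigabot.ts — a minimal sketch; the log path and format are assumptions
import { createReadStream } from "node:fs";
import { createInterface } from "node:readline";

async function countGigabotHits(logPath: string): Promise<number> {
  let hits = 0;
  const lines = createInterface({ input: createReadStream(logPath) });
  for await (const line of lines) {
    // In combined log format, the user agent is the final quoted field
    if (line.includes("Gigabot")) hits++;
  }
  return hits;
}

countGigabotHits("/var/log/nginx/access.log") // hypothetical path
  .then((hits) => console.log(`Gigabot requests: ${hits}`));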

Expected Behavior

Search engine crawlers do not adhere to a fixed visitation schedule. How often they visit a website varies widely based on several factors, including the site's popularity, how frequently its content is updated, and its overall trustworthiness. Websites with fresh, high-quality content tend to be crawled more frequently, while less active or less reputable sites may be visited less often.

Type

Search Engine Crawler
Indexes web content for search engine results

Detail

Operated By: Gigablast
Last Updated: 13 hours ago

Insights

Top Website Robots.txts

1% of top websites are blocking Gigabot

Country of Origin

United States
Gigabot normally visits from the United States

Global Traffic

The percentage of all internet traffic coming from Search Engine Crawlers

How Do I Get These Insights for My Website?
Use the WordPress plugin, Node.js package, or API to get started in seconds.
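For a server-side integration, you report each incoming request to the analytics API and Dark Visitors attributes it to an agent by its user agent header. The sketch below is a minimal TypeScript example; the endpoint and field names follow the Dark Visitors API docs as best recalled here, so treat them as assumptions and verify against the current API reference.

// report-visit.ts — a minimal sketch of the Agent Analytics API; endpoint
// and field names are assumptions, so check the current API reference.
const DARK_VISITORS_TOKEN = process.env.DARK_VISITORS_TOKEN; // your access token

export async function reportVisit(req: {
  path: string;
  method: string;
  headers: Record<string, string>;
}): Promise<void> {
  await fetch("https://api.darkvisitors.com/visits", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${DARK_VISITORS_TOKEN}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      request_path: req.path,
      request_method: req.method,
      request_headers: req.headers, // includes the user agent for attribution
    }),
  });
}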

Robots.txt

Should I Block Gigabot?

Probably not. Search engine crawlers power search engines, which are a useful way for users to discover your website. Blocking them could severely reduce your organic traffic.
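If your concern is specific content rather than search visibility, robots.txt can keep Gigabot out of individual paths while leaving the rest of the site crawlable. A sketch, assuming a hypothetical /private/ directory:

# In your robots.txt ...

User-agent: Gigabot
Disallow: /private/ # hypothetical path; everything else stays crawlable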

How Do I Block Gigabot?

You can block Gigabot or limit its access by adding rules for its user agent token to your website's robots.txt, as shown in the example below. Set up Dark Visitors Agent Analytics to check whether it's actually following them.

User Agent String: Gigabot/1.0

# In your robots.txt ...

User-agent: Gigabot # https://darkvisitors.com/agents/gigabot
Disallow: /

How Do I Block All Search Engine Crawlers?

Serve a continuously updating robots.txt that blocks new search engine crawlers automatically, as sketched below.
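One way to do this is to fetch a generated robots.txt on a schedule and cache it, so newly discovered crawlers are covered without manual edits. This TypeScript sketch assumes the Dark Visitors robots.txt endpoint and request fields; verify both against the current docs before use.

// serve-robots.ts — a minimal sketch, assuming this endpoint and body
// shape for the Dark Visitors robots.txt API; check the current docs.
const DARK_VISITORS_TOKEN = process.env.DARK_VISITORS_TOKEN;

let cachedRobotsTxt = "User-agent: *\nDisallow:"; // permissive fallback
let lastFetched = 0;
const ONE_DAY_MS = 24 * 60 * 60 * 1000;

export async function getRobotsTxt(): Promise<string> {
  // Refresh at most once a day; serve the cached copy otherwise
  if (Date.now() - lastFetched > ONE_DAY_MS) {
    const response = await fetch("https://api.darkvisitors.com/robots-txts", {
      method: "POST",
      headers: {
        Authorization: `Bearer ${DARK_VISITORS_TOKEN}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        agent_types: ["Search Engine Crawler"], // block this agent type
        disallow: "/",
      }),
    });
    cachedRobotsTxt = await response.text();
    lastFetched = Date.now();
  }
  return cachedRobotsTxt;
}

// Wire getRobotsTxt() to your /robots.txt route so the file stays current.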

⚠️ Manual Robots.txt Editing Is Not Scalable

New agents are created every day. We recommend setting up Dark Visitors Automatic Robots.txt if you want to block all agents of this type.