PetalBot

What is PetalBot?

About

PetalBot is Huawei's web crawler. It indexes PC and mobile websites to build search databases for the Petal Search engine and to provide AI-powered content recommendations for Huawei Assistant and other Huawei AI services. You can see how often PetalBot visits your website by setting up Dark Visitors Agent Analytics.

Expected Behavior

AI search crawlers do not adhere to a fixed visitation schedule. The frequency of visits varies widely based on many factors, and a visit can even happen on demand in response to a user query.
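One rough way to observe this variability yourself is to tally PetalBot requests per day from your own server access log. A minimal TypeScript sketch, assuming an nginx/Apache combined-format log at a placeholder path:

// Count PetalBot requests per day from an access log.
// The log path and combined log format are assumptions for this sketch.
import { readFileSync } from "node:fs";

const log = readFileSync("/var/log/nginx/access.log", "utf8"); // placeholder path
const countsPerDay = new Map<string, number>();

for (const line of log.split("\n")) {
  if (!line.includes("PetalBot")) continue; // match the user agent token
  const match = line.match(/\[(\d{2}\/\w{3}\/\d{4})/); // e.g. [13/May/2024
  if (!match) continue;
  countsPerDay.set(match[1], (countsPerDay.get(match[1]) ?? 0) + 1);
}

for (const [day, count] of countsPerDay) {
  console.log(`${day}: ${count} PetalBot requests`);
}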

Type

AI Search Crawler
Indexes web content for AI-powered search results

Detail

Operated By Huawei
Last Updated 13 hours ago

Insights

Top Website Robots.txts

6%
6% of top websites are blocking PetalBot

Country of Origin

Singapore
PetalBot normally visits from Singapore

Global Traffic

The percentage of all internet traffic coming from AI Search Crawlers

Top Visited Website Categories

Health
Shopping
Pets and Animals
Sports
Travel and Transportation

How Do I Get These Insights for My Website?

Use the WordPress plugin, Node.js package, or API to get started in seconds.

Robots.txt

Should I Block PetalBot?

Probably not. AI search crawlers power AI search engines, which are a useful way for users to discover your website. AI search engine assistants usually cite their sources and include links to them in their results.

How Do I Block PetalBot?

You can block PetalBot, or limit which parts of your site it can access, by adding rules for its user agent token to your website's robots.txt. Set up Dark Visitors Agent Analytics to check whether it's actually following them.
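For example, a minimal robots.txt sketch that keeps PetalBot out of selected sections rather than blocking it outright (the /private/ and /search/ paths are placeholders):

User-agent: PetalBot # https://darkvisitors.com/agents/petalbot
Disallow: /private/
Disallow: /search/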

User Agent String

Mozilla/5.0 (Linux; Android 7.0;) AppleWebKit/537.36 (KHTML, like Gecko) Mobile Safari/537.36 (compatible; PetalBot;+https://webmaster.petalsearch.com/site/petalbot)

# In your robots.txt ...

User-agent: PetalBot # https://darkvisitors.com/agents/petalbot
Disallow: /

How Do I Block All AI Search Crawlers?

Serve a continuously updating robots.txt that blocks new AI search crawlers automatically.
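One way to serve a continuously updating robots.txt is to have your server fetch and cache a maintained copy from an upstream source and serve it on the /robots.txt route. A minimal TypeScript/Express sketch, assuming a placeholder upstream URL (this is a generic illustration, not the Dark Visitors product itself):

// Serve /robots.txt from a periodically refreshed upstream copy.
// UPSTREAM_ROBOTS_TXT_URL is a placeholder for whatever service maintains
// an up-to-date list of AI crawler user agents.
import express from "express";

const UPSTREAM_ROBOTS_TXT_URL = "https://example.com/latest-robots.txt"; // placeholder
const REFRESH_INTERVAL_MS = 24 * 60 * 60 * 1000; // refresh once a day

let cachedRobotsTxt = "User-agent: PetalBot\nDisallow: /\n"; // fallback if the first fetch fails
let lastFetched = 0;

async function getRobotsTxt(): Promise<string> {
  if (Date.now() - lastFetched > REFRESH_INTERVAL_MS) {
    try {
      const response = await fetch(UPSTREAM_ROBOTS_TXT_URL);
      if (response.ok) {
        cachedRobotsTxt = await response.text();
        lastFetched = Date.now();
      }
    } catch {
      // Keep serving the cached copy if the upstream is unreachable.
    }
  }
  return cachedRobotsTxt;
}

const app = express();

app.get("/robots.txt", async (_req, res) => {
  res.type("text/plain").send(await getRobotsTxt());
});

app.listen(3000);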

⚠️ Manual Robots.txt Editing Is Not Scalable

New agents are created every day. We recommend setting up Dark Visitors Automatic Robots.txt if you want to block all agents of this type.
