Perplexity-User
What is Perplexity-User?
About
Perplexity-User supports user actions within Perplexity. When users ask Perplexity a question, it might visit a web page to help provide an accurate answer and include a link to the page in its response. You can see how often Perplexity-User visits your website by setting up Dark Visitors Agent Analytics.
Agent Type
AI Assistant
Expected Behavior
AI assistants fetch web content on-demand in response to specific user queries during conversations. Unlike crawlers that systematically index websites, these assistants make targeted, one-off requests to retrieve current information that supplements their training data. Traffic is unpredictable and driven entirely by what users ask about. You might see no requests for weeks, then sudden bursts when topics related to your content become relevant. The retrieved content is used to generate responses through retrieval-augmented generation (RAG).
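The retrieval step can be pictured with the minimal sketch below. It illustrates the general fetch-and-prompt pattern only, not Perplexity's actual implementation; the URL, user agent, and prompt template are placeholders.

# Minimal sketch of the general retrieval-augmented pattern (illustrative only,
# not Perplexity's implementation). Placeholder URL and user agent throughout.
import requests

def fetch_page_text(url: str) -> str:
    """Fetch a page on demand, identifying the client in the User-Agent header."""
    response = requests.get(
        url,
        headers={"User-Agent": "ExampleAssistant/1.0 (+https://example.com/bot)"},
        timeout=10,
    )
    response.raise_for_status()
    return response.text

def build_prompt(question: str, page_text: str) -> str:
    """Combine the user's question with retrieved content to ground the answer."""
    return (
        "Answer the question using the source below and cite it.\n\n"
        f"Source:\n{page_text[:4000]}\n\n"
        f"Question: {question}"
    )

# The assembled prompt is then sent to a language model, which generates the
# response the user sees, typically with a link back to the source page.
prompt = build_prompt(
    "What changed in the latest release?",
    fetch_page_text("https://example.com/"),
)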
Detail
Operated By | Perplexity
Last Updated | 19 hours ago
Insights
Charts on this page: Top Website Robots.txts, Country of Origin, Global Traffic (the percentage of all internet traffic coming from AI assistants), and Top Visited Website Categories.
Robots.txt
Should I Block Perplexity-User?
Probably not. AI assistants visit websites directly on behalf of human users, so blocking them effectively blocks those users. This can lead to a poor user experience and negative sentiment toward your website. Leaving AI assistants unblocked lets those users reach your content however they choose.
How Do I Block Perplexity-User?
You can block Perplexity-User or limit its access by setting user agent token rules in your website's robots.txt. Set up Dark Visitors Agent Analytics to check whether it's actually following them.
User Agent String | Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; compatible; Perplexity-User/1.0; +https://perplexity.ai/perplexity-user)
# In your robots.txt ...
User-agent: Perplexity-User # https://darkvisitors.com/agents/perplexity-user
Disallow: /
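To check whether the agent honors these rules, you can scan your server's access log for the user agent token above. The sketch below is a rough example that assumes an Apache/nginx combined-format log; the log path and disallowed prefixes are placeholders you would adjust to match your own server and robots.txt.

# Rough compliance check: count Perplexity-User requests in a combined-format
# access log and flag any that hit paths your robots.txt disallows.
# The log path and disallowed prefixes below are placeholders.
import re
from collections import Counter

LOG_PATH = "/var/log/nginx/access.log"   # adjust for your server
AGENT_TOKEN = "Perplexity-User"          # token from the user agent string above
DISALLOWED_PREFIXES = ["/"]              # mirror your robots.txt Disallow rules

# Matches the request, status, referer, and user agent fields of a combined log line.
LINE_RE = re.compile(r'"(?P<method>\S+) (?P<path>\S+) [^"]*" \d+ \S+ "[^"]*" "(?P<ua>[^"]*)"')

hits = Counter()
violations = 0

with open(LOG_PATH) as log:
    for line in log:
        match = LINE_RE.search(line)
        if not match or AGENT_TOKEN not in match.group("ua"):
            continue
        hits[match.group("path")] += 1
        if any(match.group("path").startswith(p) for p in DISALLOWED_PREFIXES):
            violations += 1

print("Perplexity-User requests by path:", dict(hits))
print("Requests to disallowed paths:", violations)

Requests to disallowed paths that arrive after the rule has been in place suggest the agent is not honoring your robots.txt.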
⚠️ Manual Robots.txt Editing Is Not Scalable
New agents are created every day. We recommend setting up Dark Visitors Automatic Robots.txt if you want to block all agents of this type.