SemanticScholarBot


What is SemanticScholarBot?

About

SemanticScholarBot is a search engine crawler operated by Ai2. It is not currently known to be an AI agent or otherwise AI-related. If you think that's incorrect or can provide more detail about its purpose, please contact us. Set up Dark Visitors agent analytics to see how often SemanticScholarBot visits your website.

Detail

Operator: Ai2
Documentation: https://semanticscholar.org/crawler

Type

Search Engine Crawler
Indexes web content for search engine results

Expected Behavior

Search engine crawlers do not adhere to a fixed visitation schedule. How often a crawler visits a website varies widely based on several factors, including the site's popularity, how frequently its content is updated, and its overall trustworthiness. Websites with fresh, high-quality content tend to be crawled more frequently, while less active or less reputable sites may be visited less often.

Analytics

Visits to Your Website

Half of your traffic probably comes from artificial agents, and there are more of them every day. Track their activity with agent analytics.

Set Up Agent Analytics
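
If you want a rough, do-it-yourself count before setting up analytics, you can scan your server's access log for the SemanticScholarBot user agent token (listed in the Robots.txt section below). The following is a minimal sketch in Python, assuming a combined-format access log at a hypothetical path named access.log; adjust the path and parsing for your own server.

# count_semanticscholarbot.py
# Minimal sketch: count requests whose user agent contains the
# SemanticScholarBot token in a combined-format access log.
# The log path and format are assumptions; adjust for your server.

from collections import Counter

LOG_PATH = "access.log"          # hypothetical path to your web server log
TOKEN = "SemanticScholarBot"     # user agent token from the Robots.txt section

hits_per_path = Counter()
total = 0

with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        if TOKEN not in line:
            continue
        total += 1
        # In the combined log format the request is the first quoted
        # field, e.g. "GET /papers/123 HTTP/1.1".
        try:
            request = line.split('"')[1]
            path = request.split()[1]
            hits_per_path[path] += 1
        except IndexError:
            pass  # malformed line; skip it

print(f"{total} requests from {TOKEN}")
for path, count in hits_per_path.most_common(10):
    print(f"{count:6d}  {path}")

This only approximates what agent analytics measures, since it trusts the user agent string and sees only the one log file you point it at.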

Other Websites

0% of top websites are blocking SemanticScholarBot
Learn How →

Robots.txt

Should I Block SemanticScholarBot?

Probably not. Search engine crawlers power search engines, which help users discover your website, so blocking them could significantly reduce your traffic.

How Do I Block SemanticScholarBot?

You can block SemanticScholarBot or limit its access by setting user agent token rules in your website's robots.txt. Set up Dark Visitors agent analytics to check whether it's actually following them.

User Agent Token: SemanticScholarBot
Description: Should match instances of SemanticScholarBot
# robots.txt
# This should block SemanticScholarBot

User-agent: SemanticScholarBot
Disallow: /
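
After publishing a rule like the one above, it can help to confirm that your live robots.txt says what you intend. Below is a minimal sketch using Python's standard urllib.robotparser; the example.com URLs are placeholders for your own pages.

# check_robots.py
# Minimal sketch: verify what your published robots.txt allows for
# SemanticScholarBot. The example.com URLs are placeholders.

from urllib.robotparser import RobotFileParser

robots_url = "https://example.com/robots.txt"  # replace with your site
parser = RobotFileParser()
parser.set_url(robots_url)
parser.read()  # fetches and parses the live robots.txt

for page in ("https://example.com/", "https://example.com/papers/123"):
    allowed = parser.can_fetch("SemanticScholarBot", page)
    print(f"{page}: {'allowed' if allowed else 'blocked'}")

# With the Disallow: / rule above, both pages should print "blocked".

The same check works if you only Disallow specific paths rather than the whole site. Note that it only tells you what your rules request; whether SemanticScholarBot actually honors them is what agent analytics is for.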

Recommended Solution

Instead of maintaining these rules manually, use automatic robots.txt to keep them updated with the latest AI scrapers, crawlers, and assistants.

Set Up Automatic Robots.txt