Slackbot-LinkExpanding

What is Slackbot-LinkExpanding?

Slackbot-LinkExpanding is Slack's link unfurling bot. When a user posts a link in a channel, it fetches the page and extracts metadata such as Open Graph tags, Twitter Card tags, and oEmbed data to display a rich media preview. You can see how often Slackbot-LinkExpanding visits your website by setting up Dark Visitors Agent Analytics.
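To make the extraction step concrete, here is a minimal sketch of what a link unfurler of this kind does, written in Python with only the standard library. It is not Slack's implementation; the example-unfurler user agent and the unfurl helper are illustrative assumptions.

from html.parser import HTMLParser
from urllib.request import Request, urlopen

class MetaTagParser(HTMLParser):
    """Collects Open Graph (og:*) and Twitter Card (twitter:*) meta tags."""

    def __init__(self):
        super().__init__()
        self.tags = {}

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        # Open Graph uses the "property" attribute, Twitter Cards use "name"
        key = attrs.get("property") or attrs.get("name")
        if key and key.startswith(("og:", "twitter:")) and attrs.get("content"):
            self.tags[key] = attrs["content"]

def unfurl(url: str) -> dict:
    """Fetch a page and return the preview metadata a fetcher would read."""
    req = Request(url, headers={"User-Agent": "example-unfurler/1.0"})  # illustrative UA
    with urlopen(req, timeout=10) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    parser = MetaTagParser()
    parser.feed(html)
    return parser.tags

# unfurl("https://example.com") might return:
# {"og:title": "Example", "og:image": "https://example.com/thumb.png", ...}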

Expected Behavior

Fetchers visit websites on demand. They're normally dispatched by apps to present metadata about a link (e.g. the title and thumbnail image) to a user.

Type

Fetcher: Fetches web content on behalf of an app

Detail

Operated By: Slack
Last Updated: 16 hours ago

Insights

Top Website Robots.txts

0% of top websites are blocking Slackbot-LinkExpanding

Country of Origin

United States
Slackbot-LinkExpanding normally visits from the United States

Global Traffic

The percentage of all internet traffic coming from Fetchers

Top Visited Website Categories

Internet and Telecom
Computers and Electronics
News
Real Estate
Business and Industrial

How Do I Get These Insights for My Website?

Use the WordPress plugin, Node.js package, or API to get started in seconds.

Robots.txt

Should I Block Slackbot-LinkExpanding?

Probably not. If you block a fetcher, it can't retrieve the metadata it needs to display a link preview, and without a preview fewer people will click through to your website.

How Do I Block Slackbot-LinkExpanding?

You can block Slackbot-LinkExpanding or limit its access by setting user agent token rules in your website's robots.txt. Set up Dark Visitors Agent Analytics to check whether it's actually following them.
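As a rough spot-check alongside Agent Analytics, you can count the fetcher's requests in your own access logs. Below is a minimal sketch in Python that assumes combined-format (Apache/nginx style) logs; the regex and the fetcher_hits helper are assumptions about a typical setup, not part of any Dark Visitors tooling.

import re
from collections import Counter

AGENT_TOKEN = "Slackbot-LinkExpanding"

# Combined log format ends with: "referer" "user-agent"
LINE = re.compile(r'"(?:GET|POST|HEAD) (\S+) [^"]*" \d+ \S+ "[^"]*" "([^"]*)"')

def fetcher_hits(log_path: str) -> Counter:
    """Count requests per path whose user agent contains the fetcher's token."""
    hits = Counter()
    with open(log_path) as log:
        for line in log:
            m = LINE.search(line)
            if m and AGENT_TOKEN in m.group(2):
                hits[m.group(1)] += 1
    return hits

# Requests to disallowed paths after your robots.txt change suggest the rules are being ignored.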

User Agent String

Slackbot-LinkExpanding 1.0 (+https://api.slack.com/robots)

# In your robots.txt ...

User-agent: Slackbot-LinkExpanding # https://darkvisitors.com/agents/slackbot-linkexpanding
Disallow: /

How Do I Block All Fetchers?

Serve a continuously updating robots.txt that blocks new fetchers automatically.
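Whichever rules you choose, you can verify that they match the fetcher's user agent token before deploying them. Here is a small sketch using Python's standard urllib.robotparser; example.com stands in for your own site.

from urllib.robotparser import RobotFileParser

rules = """\
User-agent: Slackbot-LinkExpanding
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# False: the fetcher's token matches the group above, so every path is disallowed
print(parser.can_fetch("Slackbot-LinkExpanding", "https://example.com/some-page"))

# True: other agents are unaffected because there is no wildcard (*) group
print(parser.can_fetch("Googlebot", "https://example.com/some-page"))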

⚠️ Manual Robots.txt Editing Is Not Scalable

New agents are created every day. We recommend setting up Dark Visitors Automatic Robots.txt if you want to block all agents of this type.
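For illustration only, here is one way the continuously updating approach could look, sketched with Flask and a placeholder upstream URL. It assumes a maintained endpoint that returns current robots.txt rules; it is not the actual Dark Visitors integration.

import time
import urllib.request

from flask import Flask, Response

app = Flask(__name__)
UPSTREAM_URL = "https://example.com/maintained-robots.txt"  # placeholder endpoint
CACHE_TTL = 3600  # refetch at most once an hour
_cache = {"body": "User-agent: *\nAllow: /\n", "fetched_at": 0.0}

@app.route("/robots.txt")
def robots_txt():
    # Refresh the cached rules when stale, keeping the old copy on failure
    if time.time() - _cache["fetched_at"] > CACHE_TTL:
        try:
            with urllib.request.urlopen(UPSTREAM_URL, timeout=5) as resp:
                _cache["body"] = resp.read().decode("utf-8")
                _cache["fetched_at"] = time.time()
        except OSError:
            pass  # serve the last known rules rather than failing
    return Response(_cache["body"], mimetype="text/plain")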
