Knowledge Base

Support Policies

Prevent SecureAuth Realms from Being Indexed by Search Sites

To prevent search engines from crawling your website, update the robots.txt file, which is located at D:\inetpub\wwwroot\robots.txt.


User-agent: *

Disallow:

This configuration allows all search engines to crawl your website; an empty Disallow value blocks nothing.


User-agent: *
Disallow: /

This configuration blocks all search engines from crawling your website.


User-agent: Ezooms
Disallow: /

This configuration blocks only the named crawler (here, Ezooms) from crawling your website; all other crawlers remain unaffected.
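The rules above can be checked locally with Python's standard urllib.robotparser module before deploying the file. This sketch combines the Ezooms rule with the allow-all default; the paths are hypothetical examples, not actual SecureAuth realm paths:

```python
from urllib import robotparser

# Hypothetical robots.txt content mirroring the examples above:
# block Ezooms everywhere, allow every other crawler.
rules = """
User-agent: Ezooms
Disallow: /

User-agent: *
Disallow:
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# Ezooms matches its own group and is blocked from all paths.
print(rp.can_fetch("Ezooms", "/secureauth1/"))     # False
# Any other crawler falls through to the wildcard group and is allowed.
print(rp.can_fetch("Googlebot", "/secureauth1/"))  # True
```

Editing the rules string and re-running the checks is a quick way to confirm a change behaves as intended before copying it into D:\inetpub\wwwroot\robots.txt.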


Top 3 U.S. search engine User-agents:
Googlebot
Yahoo! Slurp
bingbot

Common search engine User-agents blocked:
AhrefsBot
Baiduspider
Ezooms
MJ12bot
YandexBot
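
As one possible configuration, the commonly blocked crawlers listed above can share a single Disallow rule by stacking their User-agent lines in one group (the names are taken from the list above; verify each against the bot operator's current documentation before relying on it):


User-agent: AhrefsBot
User-agent: Baiduspider
User-agent: Ezooms
User-agent: MJ12bot
User-agent: YandexBot
Disallow: /

This configuration blocks all five crawlers while leaving every other search engine unaffected.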
