Prevent SecureAuth Realms from Being Indexed by Search Sites


To prevent search engines from crawling your website, update the robots.txt file, located at D:\inetpub\wwwroot\robots.txt. The configurations below show how to allow all crawlers, block all crawlers, or block a specific crawler.


User-agent: *

Disallow:

This configuration allows all search engines to crawl your website.


User-agent: *
Disallow: /

This configuration disallows any search engine from crawling your website.


User-agent: Ezooms
Disallow: /

This configuration disallows a specific search engine from crawling your website.
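The three configurations above can be checked locally with Python's standard-library robots.txt parser. The `can_crawl` helper below is a hypothetical name introduced for illustration; it wraps `urllib.robotparser` to test whether a given user agent may fetch a path under a given set of rules.

```python
import urllib.robotparser

def can_crawl(rules: str, agent: str, path: str) -> bool:
    """Return True if `agent` may fetch `path` under the given robots.txt rules."""
    parser = urllib.robotparser.RobotFileParser()
    parser.parse(rules.splitlines())
    return parser.can_fetch(agent, path)

# Empty Disallow: every crawler is allowed.
print(can_crawl("User-agent: *\nDisallow:", "Googlebot", "/"))        # True

# "Disallow: /" blocks every crawler.
print(can_crawl("User-agent: *\nDisallow: /", "Googlebot", "/"))      # False

# Blocking only Ezooms leaves other crawlers unaffected.
rules = "User-agent: Ezooms\nDisallow: /"
print(can_crawl(rules, "Ezooms", "/"))     # False
print(can_crawl(rules, "Googlebot", "/"))  # True
```

Note that robots.txt is advisory: well-behaved crawlers honor it, but it does not enforce access control.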


Top 3 U.S. search engine User-agents:
Googlebot
Yahoo! Slurp
bingbot

Common search engine User-agents blocked:
AhrefsBot
Baiduspider
Ezooms
MJ12bot
YandexBot
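To block several specific crawlers at once, a single rule group may list multiple User-agent lines followed by one Disallow line. As a sketch, blocking every bot in the list above:

User-agent: AhrefsBot
User-agent: Baiduspider
User-agent: Ezooms
User-agent: MJ12bot
User-agent: YandexBot
Disallow: /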

