Prevent SecureAuth Realms from Being Indexed by Search Sites

    Applies to:
  • Legacy SecureAuth IdP

    Deployment model:
  • On Premises

    Description: External SecureAuth realms can be discovered through search engines.

    Cause: These external realms are publicly accessible, so search engine crawlers can reach and index them.

    Resolution: To prevent search engines from crawling your website, update the robots.txt file located at D:\inetpub\wwwroot\robots.txt. The following example configurations cover the common cases.


    User-agent: *
    Disallow:

    This configuration allows all search engines to crawl your website (an empty Disallow value blocks nothing).


    User-agent: *
    Disallow: /

    This configuration disallows all search engines from crawling your website.


    User-agent: Ezooms
    Disallow: /

    This configuration disallows a specific crawler (in this example, Ezooms) from crawling your website.


    Top 3 U.S. search engine User-agents (an example blocking these follows the list):
  • Googlebot
  • Yahoo! Slurp
  • bingbot
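
    To block only these mainstream search engines, you can give each one its own group, as in the examples above. A minimal sketch (Yahoo!'s crawler is typically matched by the token Slurp in robots.txt):

    User-agent: Googlebot
    Disallow: /

    User-agent: Slurp
    Disallow: /

    User-agent: bingbot
    Disallow: /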

    Commonly blocked crawler User-agents (a combined example follows the list):
  • AhrefsBot
  • Baiduspider
  • Ezooms
  • MJ12bot
  • YandexBot
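
    The robots exclusion standard also permits several User-agent lines in a single group, so one Disallow rule can cover all of them. A sketch that blocks the five crawlers above (if a particular bot parses robots.txt strictly, fall back to one group per User-agent as in the earlier examples):

    User-agent: AhrefsBot
    User-agent: Baiduspider
    User-agent: Ezooms
    User-agent: MJ12bot
    User-agent: YandexBot
    Disallow: /

    Note that robots.txt is advisory: well-behaved crawlers honor it, but it does not technically block access, and poorly behaved bots may ignore it entirely.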


    SecureAuth Knowledge Base Articles provide information based on specific use cases and may not apply to all appliances or configurations. Be advised that these instructions could cause harm to the environment if not followed correctly or if they do not apply to the current use case.

    Customers are responsible for their own due diligence prior to utilizing this information and agree that SecureAuth is not liable for any issues caused by misconfiguration directly or indirectly related to SecureAuth products.
