
How can I prevent specific pages or directories from being included in the Sitemap and Search Engines?

If you wish to prevent certain pages or directories from appearing in our sitemap and to keep search engines from indexing them altogether, you can block our crawler and search engine crawlers by adding Disallow rules to your public_html/robots.txt file.

For example, to block all crawlers from indexing and storing the /private directory, add the following lines to your robots.txt file:

User-agent: *
Disallow: /private

The User-agent line specifies which crawlers the rule applies to; * means all crawlers (Attracta, Google, Bing, Yahoo, etc.). Each Disallow line names one folder or file on your site to block, one entry per Disallow line.
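For instance, to apply a rule to just one crawler while leaving all others unrestricted, you can name that crawler's user-agent token instead of using *. Googlebot (Google's crawler token) is shown here purely as an illustration:

User-agent: Googlebot
Disallow: /private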

That means that if you wish to block both /private and /staff/bios.html, you would add these lines:

User-agent: *
Disallow: /private
Disallow: /staff/bios.html
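If you have Python available, you can sanity-check these rules before publishing the file using the standard library's built-in robots.txt parser. This is just an optional verification sketch; the paths are the ones from the example above:

```python
from urllib import robotparser

# The robots.txt rules from the example above.
rules = """\
User-agent: *
Disallow: /private
Disallow: /staff/bios.html
"""

# Parse the rules from a list of lines (no network request needed).
parser = robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# Blocked paths are disallowed for every crawler; everything else is allowed.
print(parser.can_fetch("*", "/private"))          # False
print(parser.can_fetch("*", "/staff/bios.html"))  # False
print(parser.can_fetch("*", "/index.html"))       # True
```

If a path you meant to block prints True here, double-check the spelling and leading slash in the corresponding Disallow line.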

More detailed information about using the robots.txt file is available online.

