

How to keep robots off your site.

As hard as it may be to believe, there may be times when you do not want robots crawling your web pages.

According to robotstxt.org, Web Robots (also known as Web Wanderers, Crawlers, or Spiders) are programs that traverse the Web automatically. Search engines such as Google use them to index web content, spammers use them to scan for email addresses, and they have many other uses.

By specifying where search engines should look for content, and steering them toward your high-quality directories and files, you can improve how your site is indexed and ranked. Providing these instructions is recommended by Google and the other major search engines.
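
One widely supported way to point crawlers at your content is to list your sitemap in robots.txt itself. This is only a sketch; the URL below is a placeholder you would replace with your own sitemap address:

Sitemap: http://www.example.com/sitemap.xml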

A robots.txt file can tell search engine spiders not to crawl or index certain sections or pages of your site. You can use it to prevent indexing entirely, to keep certain areas of your site from being indexed, or to issue individual indexing instructions to specific search engines.

All search engines, or at least all the major ones, now look for a robots.txt file at the root of your site (for example, http://www.example.com/robots.txt) before they crawl it.

There are a number of situations where you may wish to exclude spiders from some or all of your site.

  1. You are still building the site and do not want the unfinished work to appear in search engines (see the example after this list).
  2. You have information that is of no interest to anyone but its intended audience, and you would prefer it did not appear in search engine results.
  3. You would like to exclude bots or spiders whose chief purpose is collecting email addresses.
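
For the first situation, a minimal robots.txt that asks every well-behaved spider to stay out of the whole site might look like this (the * matches all user agents, and a bare / blocks every path on the site):

User-Agent: *
Disallow: /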

Basically, the file looks like this. It can be created in any plain text editor, such as Notepad. Each entry has just two lines:

User-Agent: [Spider or Bot name]
Disallow: [Directory or File Name]

This pair of lines can be repeated for each directory or file you want to exclude, or for each spider or bot you want to block.

A few examples will make it clearer.

User-Agent: Googlebot
Disallow: /private/privatefile.htm
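
This tells Google's spider, Googlebot, not to crawl the single file privatefile.htm. The entries below sketch two more of the situations listed earlier: the first record repeats the Disallow line to keep every spider out of two directories, and the second shuts out one particular bot entirely. The bot name here is only an illustration; substitute the actual user-agent string of the harvester you want to block.

User-Agent: *
Disallow: /cgi-bin/
Disallow: /private/

User-Agent: EmailSiphon
Disallow: /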
