Have you ever wondered how to hide information from search engines?
What if you don't want search engines crawling your website administration folder, or pages that contain more sensitive data? What if you don't want certain search engines to crawl any of your content at all?
That is what the robots.txt file is for! All you need to do is put it in your root folder and follow a few standard rules, and you're good to go! The robots.txt file is what you use to tell search robots, also known as Web Wanderers, Crawlers, or Spiders, which pages you would like them not to visit. Robots.txt is by no means mandatory for search engines, but search engines generally obey what it asks them not to do.
Advantages of Using robots.txt
- Index Web Pages Faster
Robots can get to your important pages sooner if you tell them not to waste time crawling folders that don't need indexing, such as your images, JavaScript, or CSS folders (see the sketch after this list).
- Avoid Duplicate Content Penalty
If the same content appears on multiple pages, say one version for printing and another for viewing in the browser, chances are high that search engines will penalize you for duplicate content. Blocking one of the versions in robots.txt avoids this.
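As a quick sketch of both points, a file like the following tells every bot to skip the asset folders and the print versions. The /js/, /css/, and /print/ folder names here are assumptions; substitute your own paths:
User-agent: *
Disallow: /js/
Disallow: /css/
Disallow: /print/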
Important Considerations Before Using robots.txt
- Malware robots can ignore your /robots.txt and scan the web for security vulnerabilities; email address harvesters used by spammers will pay no attention to it either.
- The robots.txt file can be accessed by anyone, so anyone can see which sections of your server you don't want robots to crawl.
So don't try to use /robots.txt to hide information or to make the website secure. It's just about showing the robots which doors are open and which are closed for crawling.
Creating a robots.txt File
The location of robots.txt is very important. Create a regular text file in Notepad (or any plain-text editor) called "robots.txt" and upload it to your root directory (www.your-site.com/robots.txt). Note that search engines will look for robots.txt only in the root folder and not in any other location on the website. In other words, put it in the top-level directory of your web server. If no robots.txt exists in the root folder, bots index everything they find along the way. Remember to use all lower case for the filename: "robots.txt", not "Robots.TXT".
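For example, using the placeholder domain above (the /pages/ subfolder is hypothetical):
www.your-site.com/robots.txt (correct: robots will look here)
www.your-site.com/pages/robots.txt (wrong: robots will never look here)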
The structure of a robots.txt file is simple: it is a list of user agents and disallowed files and directories.
"User-agent:" names a search engine's crawler or bot, and "Disallow:" lists the files/directories to be excluded from indexing.
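In skeleton form, each record looks like this (the bracketed values are placeholders, not literal syntax):
User-agent: [bot name, or * for all bots]
Disallow: [path to exclude]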
Some Great robots.txt Operators
1. To exclude all robots from the entire server
User-agent: *
Disallow: /
Here all search engine crawlers are told (using *) not to index any of the pages on the website. "/" means the root of the site, and therefore everything under it.
2. To exclude a single robot
User-agent: Googlebot-Image
Disallow: /
This disallows Google's image bot from crawling your website. Tricks like this help save your host's bandwidth.
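The same pattern works for any named crawler. For instance, Yahoo's crawler identifies itself as Slurp, so this record would exclude it entirely:
User-agent: Slurp
Disallow: /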
3. To exclude all robots from part of the server
User-agent: *
Disallow: /css/
Disallow: /services/internet-marketing.php
The above code disallows the css folder and the internet-marketing.php file under the services folder. Since the user agent is *, this applies to all bots.
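Note that Disallow values are matched as simple path prefixes. For example, a rule like Disallow: /private (a hypothetical path) would block /private/, /private.html, and /private-files/ alike, so include the trailing slash when you mean only a folder.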
4. To give multiple robots different access
User-agent: *
Disallow: /
User-agent: MSNBot
Disallow: /images/
Disallow: /admin/
Here all bots are disallowed access, but MSNBot can access all folders except the images and admin folders. A bot follows the most specific record that matches it, so the rules of specificity apply, not inheritance: MSNBot does not also inherit the Disallow: / rule from the * record.
5. To allow all robots complete access
User-agent: *
Disallow:
There are no advanced features in robots.txt; you can do most things with the examples provided above.
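If you want to sanity-check your rules before uploading them, here is a minimal sketch using Python's standard urllib.robotparser module, run against the rules from example 4 (example.com is a placeholder domain):
from urllib.robotparser import RobotFileParser

# The rules from example 4 above, as a list of lines
rules = """\
User-agent: *
Disallow: /

User-agent: MSNBot
Disallow: /images/
Disallow: /admin/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# MSNBot matches its own, more specific record, so only /images/ and /admin/ are blocked
print(parser.can_fetch("MSNBot", "http://www.example.com/services/"))  # True
print(parser.can_fetch("MSNBot", "http://www.example.com/admin/"))     # False

# Every other bot falls back to the * record and is blocked everywhere
print(parser.can_fetch("Googlebot", "http://www.example.com/services/"))  # False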
Have any others to share? Did you find this helpful? Tell us about it below!