The robots.txt file tells search engine bots which parts of your website they may crawl. When you install WordPress, this file is not created by default, yet it matters for how your pages are indexed and ranked later on. Here are a few tips for WordPress users.
First things first: since it is not created by default, you will have to create it manually and place it in the root directory of your website.
Alternatively, WordPress can create it for you: choose the “I would like to block search engines, but allow normal visitors” option in the admin panel, and a robots.txt file will be generated that blocks search engine crawlers from your site. If you later change this option in the admin panel, however, the robots.txt file is not updated. This bug is known to occur in the latest version of WordPress, 1.6.2, so you will have to edit the file manually to let robots crawl your website again.
For instance, if you want to allow all robots to crawl the website again, edit the file so it contains the following lines:
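A minimal robots.txt that lets every crawler access the whole site looks like this (an empty Disallow line means nothing is blocked):

```
User-agent: *
Disallow:
```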
You can find more information about the robots.txt format on the web robots page. With it you can hide parts of your website from crawlers, or choose which search engines to allow and which to refuse.
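As a sketch of both ideas, the following robots.txt keeps all crawlers out of the WordPress admin area and refuses one particular bot entirely (the bot name here is only an illustration, not a real crawler you need to block):

```
# Keep all crawlers out of the WordPress admin area
User-agent: *
Disallow: /wp-admin/

# Refuse one specific crawler completely (example name)
User-agent: ExampleBot
Disallow: /
```

Note that more specific User-agent groups take precedence: a bot matching “ExampleBot” follows its own rules and ignores the wildcard group.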