Custom Robots.txt File Generator


The generator offers the following fields:

Default - All Robots are:
Sitemap: (leave blank if you don't have one)
Search Robots: Google, Google Image, Google Mobile, MSN Search, Yahoo MM, Yahoo Blogs, DMOZ Checker, MSN PicSearch
Restricted Directories: the path is relative to the root and must contain a trailing slash "/"

Now, create a 'robots.txt' file in your site's root directory, then copy the generated text and paste it into that file.
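A typical file produced by this kind of generator might look like the sketch below (the sitemap URL and the restricted directory are hypothetical placeholders, not output from the tool itself):

```text
User-agent: *
Disallow: /search/

Sitemap: https://example.com/sitemap.xml
```

The blank value after `Disallow:` rules would simply be omitted for robots that are fully allowed.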

About the Custom Robots.txt File Generator

What is a robots.txt file?

Robots.txt is one of the most important files to review when you are optimizing your site for search engines (SEO).

The robots.txt file, located in the root directory of your website, lets webmasters control how search-engine crawlers (bots) access the site: it can allow pages to be crawled by default, or keep bots away from certain pages or sections of your website.

It instructs bots on which areas of your website they should visit and which areas they should skip when they crawl your site. The file typically contains guidelines for web crawlers. Do not rely on it to hide content you don't want bots or other users to see: disallowing a URL does not make the content private.

Badly behaved bots can also simply ignore the instructions provided in this file.


The first line of a record in the robots.txt file is the "User-agent" line, which names the search-engine crawler the following rules apply to. For example, "User-agent: *" addresses all bots, while you can direct Google's(tm) bot specifically by adding the line: User-agent: Googlebot.

In order to disallow crawling of a particular folder, you add a line like this one: Disallow: /administrator/modules/ meaning that bots cannot access anything under /administrator/modules/. If there are specific pages you don't want crawled, you can use a more specific rule such as Disallow: /administrator/modules/xxxxxx, meaning that bots won't be able to access that specific path or anything beneath it.
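Putting the two directives together, a minimal record could read as follows (the folder path is the article's illustrative example, not a rule every site needs):

```text
User-agent: *
Disallow: /administrator/modules/
```

Each record starts with a User-agent line and is followed by the Disallow (or Allow) rules that apply to that crawler.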

It is also possible to block crawling of your entire website by adding the line Disallow: / under a "User-agent: *" line in your robots.txt document. This informs all compliant crawlers not to browse any page of your site. Keep in mind, however, that some bots do not honor robots.txt at all and will crawl your pages regardless.

There are plenty more things worth knowing about this file, for instance the "Sitemap" line, which is used for indexing purposes. If you think there is a lot to be explored on your site, let Google(tm) know by adding a "Sitemap:" line that points to your sitemap file located in the root directory of your domain.
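For example, assuming the sitemap lives at the root of a hypothetical domain, the line would look like this:

```text
Sitemap: https://example.com/sitemap.xml
```

The Sitemap directive takes an absolute URL and may appear anywhere in the robots.txt file, independent of any User-agent record.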


It is important to avoid mistakes when writing your robots.txt file: if the syntax is wrong, bots may ignore individual rules, or all bots may ignore parts of the instructions because of the errors. It is recommended that someone else examines your file prior to publishing it, so that any mistakes or misinterpreted rules can be spotted and corrected.

Most major bots, such as Google's(tm), do "read" robots.txt and crawl your site according to what is written in the file. However, robots.txt only controls crawling: deleting a page's entry does not guarantee bots will never reach that page. If you'd like to allow or block indexing of a specific page of your website, you can add a per-page robots instruction in that page's own HTML, so that there is no confusion as to what a bot is supposed to do with it.

For instance, suppose you've just created an image gallery page with hyperlinks to all images, including one called "gallery_bottom", and you want to keep that page out of search results. Adding a robots meta tag to the head of that page tells crawlers not to index it, and you're all done! Hope this helped with your robots.txt file problems.
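The original inline code for this example did not survive, so here is a minimal hedged reconstruction using the standard robots meta tag (the page name and markup are illustrative):

```html
<!DOCTYPE html>
<html>
<head>
  <title>gallery_bottom</title>
  <!-- Tell compliant crawlers not to index this page or follow its links -->
  <meta name="robots" content="noindex, nofollow">
</head>
<body>
  <!-- gallery images go here -->
</body>
</html>
```

The meta tag approach works per page and complements robots.txt: robots.txt controls which URLs are crawled, while the robots meta tag controls whether a crawled page is indexed.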


How to use the robots.txt generator

1.) Simply type in your blog's URL, then click "Create robots.txt file". This will take you to the generator's page, where the robot rules can be changed or added easily.

2.) Once the robot rules are saved, press the "Create robots.txt" button. This will take you to the robots.txt file download page. The robots.txt file for Blogger can be downloaded in a matter of seconds.

3.) Finally, the robots.txt file must be uploaded to the root directory of your blog's domain. Crawlers only look for the file at the root (e.g. yourblog.com/robots.txt), so a copy placed under a subdirectory is ignored.
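Once uploaded, you can sanity-check your rules before relying on them. A small sketch using Python's standard `urllib.robotparser` (the rules and paths below are hypothetical, mirroring the examples in this article):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules mirroring the article's examples.
rules = """\
User-agent: *
Disallow: /administrator/modules/
Sitemap: https://example.com/sitemap.xml
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# A normal page is allowed; anything under the restricted folder is not.
print(rp.can_fetch("*", "https://example.com/index.html"))               # True
print(rp.can_fetch("*", "https://example.com/administrator/modules/x"))  # False
```

In production you would point the parser at the live file with `rp.set_url("https://yourblog.com/robots.txt")` followed by `rp.read()` instead of parsing a string.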


Benefits of Google robots.txt for Blogger

robots.txt implements the Robots Exclusion Standard. It is the most effective method to inform web crawlers about your sitemap and to prevent crawling of specific pages on your site.

Google's robots guidelines are extremely beneficial in boosting your blog's SEO rankings. This tool creates the robots.txt file recommended for Blogger users, whether you are a new blogger who has just begun blogging or run a new website. A robots.txt file is a significant advantage: it helps you maintain your blog more efficiently, organize the content, and update or delete any rule at any time.

The Robots Exclusion Standard recommends that site owners place robot-exclusion rules at the root level of the domain, as there is no benefit in placing them under directories: robot crawlers do not look for the file anywhere but the root.

Benefits of the Robots.txt Generator for Blogger:

1.) It helps you conserve bandwidth as well as server storage space.

2.) Robots.txt generator rules help keep your blog running smoothly by letting you add or remove content rules at any time.

3.) With the robot exclusion rules in place, Blogger can further optimize its crawling and indexing processes by removing blocked URLs from its index.

4.) Robots.txt generator rules can also help with ad-heavy pages: by blocking URLs that carry many ads but little content, you keep thin pages out of the index, where they would otherwise lower your ranking in the SERPs and reduce your traffic.

5.) Robots.txt generator rules can also help discourage scraping of the content you publish, at least by well-behaved bots that honor the file.

6.) This free online robots.txt generator tool saves Blogger, WordPress, and Joomla! site owners time and effort by removing the need to manually define robot exclusions and write complicated robot rules.

7.) Once the generated rules are incorporated into the robots.txt file, Blogger will further improve its crawling and indexing processes by removing blocked URLs from its system.