Using the Robots.txt Tool in All in One SEO

Check out our video on how to use the Robots.txt tool here.

Are you looking to create a robots.txt file for your site? This article will help.

The robots.txt module in All in One SEO lets you manage the robots.txt that WordPress creates.

This enables you to have greater control over the instructions you give web crawlers about your site.

About the Robots.txt in WordPress

First, it’s important to understand that WordPress generates a dynamic robots.txt for every WordPress site.

This default robots.txt contains the standard rules for any site running on WordPress.
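For reference, the default rules typically look like this (the exact output can vary with your WordPress version and settings; newer versions may also append a Sitemap line):

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
```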

Second, because WordPress generates a dynamic robots.txt, there is no static file to be found on your server. The content of the robots.txt is stored in your WordPress database and output whenever the file is requested in a web browser. This is perfectly normal and preferable to a physical file, since changes take effect immediately without editing files on the server.

Lastly, All in One SEO doesn’t generate a robots.txt of its own; it simply provides an easy way to add custom rules to the default robots.txt that WordPress generates.

Using the Robots.txt Editor in All in One SEO

To get started, click on Tools in the All in One SEO menu.

Tools menu item in the All in One SEO menu

You should see the Robots.txt Editor; the first setting is Enable Custom Robots.txt. Click the toggle to enable the custom robots.txt editor.

Click the Enable Custom Robots.txt toggle in the Robots.txt Editor

You should see the Robots.txt Preview section at the bottom of the screen which shows the default rules added by WordPress.

Robots.txt Preview section in the Robots.txt Editor

Default Robots.txt Rules in WordPress

The default rules that show in the Robots.txt Preview section (shown in the screenshot above) ask robots not to crawl your core WordPress files. It’s unnecessary for search engines to access these files directly because they don’t contain any relevant site content.

If for some reason you want to remove the default rules added by WordPress, you’ll need to use the robots_txt filter hook in WordPress.
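As an illustration (a minimal sketch, not taken from the All in One SEO documentation), hooking into robots_txt from a theme's functions.php or a small plugin might look like this:

```php
<?php
/**
 * Sketch: filter the dynamically generated robots.txt output.
 * $output holds the robots.txt content WordPress built; $public indicates
 * whether the site is visible to search engines (Settings > Reading).
 */
add_filter( 'robots_txt', function ( $output, $public ) {
	if ( ! $public ) {
		return $output; // leave the "discourage search engines" rules intact
	}

	// Replace the WordPress defaults with our own rules entirely.
	// The /temp/ path here is only an example.
	return "User-agent: *\nDisallow: /temp/\n";
}, 10, 2 );
```

Returning $output unchanged keeps the default rules; returning a new string replaces them.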

Adding Rules Using the Rule Builder

The rule builder is used to add your own custom rules for specific paths on your site.

For example, if you would like to block all robots from a temp directory, you can use the rule builder to add this rule.

Adding a rule in the robots.txt rule builder

To add a rule, enter the user agent in the User Agent field. Using * will apply the rule to all user agents.

Next, select either Allow or Disallow to allow or block the user agent.

Next, enter the directory path or filename in the Directory Path field.

Finally, click the Save Changes button.
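With the temp directory example above, choosing Disallow with the * user agent and a Directory Path of /temp/ (an assumed path) would add a block like this to your robots.txt:

```
User-agent: *
Disallow: /temp/
```

To target a single crawler instead, you would enter its name, such as Googlebot, in the User Agent field.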

If you want to add more rules, click the Add Rule button, repeat the steps above, and then click the Save Changes button.

Your rules will appear in the Robots.txt Preview section and in your robots.txt, which you can view by clicking the Open Robots.txt button.

Completed custom robots.txt

Editing Rules Using the Rule Builder

To edit any rule you’ve added, just change the details in the rule builder and click the Save Changes button.

Editing a rule in the Robots.txt Editor

Deleting a Rule in the Rule Builder

To delete a rule you’ve added, click the trash can icon to the right of the rule.

Deleting a rule by clicking the trash can icon in the rule builder

Robots.txt Editor for WordPress Multisite

There is also a Robots.txt Editor for Multisite Networks. Details can be found in our documentation on the Robots.txt Editor for Multisite Networks.

Here’s a video on how to use the Robots.txt tool in All in One SEO:

Robots.txt module in All in One SEO The robots.txt module in All in One SEO allows you to set up a robots.txt file for your site that will override the default robots.txt file that WordPress creates. By creating a robots.txt file with All in One SEO Pack you have greater control over the instructions you give web crawlers about your site. Just like WordPress, All in One SEO generates a dynamic file so there is no static file to be found on your server. The content of the robots.txt file is stored in your WordPress database.

Default Rules

The default rules that show in the Create a Robots.txt File box (shown in screenshot above) ask robots not to crawl your core WordPress files. It's unnecessary for search engines to access these files directly because they don't contain any relevant site content. If for some reason you want to remove the default rules that are added by WordPress then you'll need to use the robots_txt filter hook in WordPress.

Adding Rules

The rule builder is used to add your own custom rules for specific paths on your site. For example, if you would like to add a rule to block all robots from a temp directory then you can use the rule builder to add this rule as shown below. To add a rule:

  1. Enter the User Agent. Using * will apply the rule to all user agents
  2. Select the rule type to Allow or Block a robot
  3. Enter the directory path, for example /wp-content/plugins/
  4. Click the Add Rule button
  5. The rule will appear in the table and in the box that shows your robots.txt appears

Adding a Rule in the Robots.txt module

Robots.txt Editor for WordPress Multisite

NOTE: While the robots.txt generated by All in One SEO is a dynamically generated page and not a static text file on your server, you should avoid creating a very large robots.txt for two reasons:

  1. A large robots.txt indicates a potentially complex set of rules, which could be hard to maintain.
  2. Google has proposed a maximum file size of 512KB to alleviate strain on servers from long connection times.