Using the Robots.txt Tool in All in One SEO

Are you looking to create a robots.txt file for your site? This article will help.

The robots.txt module in All in One SEO lets you create and manage a robots.txt file for your site that will override the default robots.txt file that WordPress creates.

By creating a robots.txt file with All in One SEO, you have greater control over the instructions you give web crawlers about your site.

Like WordPress, All in One SEO generates the robots.txt dynamically, so there is no static file to be found on your server. The content of the robots.txt file is stored in your WordPress database and served whenever a browser or crawler requests yoursite.com/robots.txt.

To get started, click on Tools in the All in One SEO menu.

You should see the Robots.txt Editor, where the first setting is Enable Custom Robots.txt. Click the toggle to enable the custom robots.txt editor.

You should see the Robots.txt Preview section at the bottom of the screen, which shows the default rules added by WordPress.

Default Robots.txt Rules in WordPress

The default rules that show in the Robots.txt Preview section ask robots not to crawl your core WordPress files. It’s unnecessary for search engines to access these files directly because they don’t contain any relevant site content.
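
For reference, the default output typically looks like this (depending on your WordPress version, a Sitemap line may also be appended):

  User-agent: *
  Disallow: /wp-admin/
  Allow: /wp-admin/admin-ajax.php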

If for some reason you want to remove the default rules that WordPress adds, then you’ll need to use the robots_txt filter hook in WordPress.
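
A minimal sketch of that approach, assuming you add it to a small custom plugin or your theme’s functions.php file:

  // Discard the default rules WordPress adds to its dynamic robots.txt.
  // Note: the final output can still depend on other plugins hooked into
  // the same filter at a later priority.
  add_filter( 'robots_txt', function ( $output, $public ) {
      return '';
  }, 10, 2 );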

Adding Rules Using the Rule Builder

The rule builder is used to add your own custom rules for specific paths on your site.

For example, if you want to block all robots from a temp directory, you can use the rule builder to add this rule.

To add a rule, enter the user agent in the User Agent field. Using * will apply the rule to all user agents.

Next, select either Allow or Disallow to allow or block the user agent.

Next, enter the directory path or filename in the Directory Path field.

Finally, click the Save Changes button.

To add more rules, click the Add Rule button, repeat the steps above, and then click the Save Changes button.

Your rules will appear in the Robots.txt Preview section and in your robots.txt file, which you can view by clicking the Open Robots.txt button.
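
For example, if you used the rule builder to block all robots from a hypothetical /temp/ directory, as described above, the generated robots.txt would contain lines like these:

  User-agent: *
  Disallow: /temp/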

Editing Rules Using the Rule Builder

To edit any rule you’ve added, just change the details in the rule builder and click the Save Changes button.

Deleting a Rule in the Rule Builder

To delete a rule you’ve added, click the trash can icon to the right of the rule.

Robots.txt Editor for WordPress Multisite

There is also a Robots.txt Editor for Multisite Networks. Details can be found in our documentation on the Robots.txt Editor for Multisite Networks.

NOTE: While the robots.txt generated by All in One SEO is a dynamically generated page and not a static text file on your server, you should take care when creating a large robots.txt file, for two reasons:

  1. A large robots.txt file indicates a potentially complex set of rules which could be hard to maintain.
  2. Google has proposed a maximum file size of 512KB to alleviate strain on servers from long connection times.