Do you want to know how to edit a robots.txt file in WordPress?
A robots.txt file is a powerful SEO tool since it works as a guide for search engine crawl bots, or robots. Telling bots not to crawl unnecessary pages can reduce the load on your server, increase your site’s load speed, and improve your rankings in search engines.
In this post, we’ll show you how to edit your robots.txt file in WordPress step by step. We’ll also cover what a robots.txt file is and why it’s important.
Feel free to click on these quick links to jump straight to different sections:
- What Is a Robots.txt File?
- Why Is the Robots.txt File Important?
- Edit Robots.txt in WordPress Using AIOSEO
What Is a Robots.txt File?
A robots.txt file tells search engines how to crawl your site — which parts they’re allowed to visit and which they aren’t.
Search engines like Google use these web crawlers, sometimes called web robots, to archive and categorize websites.
Most bots are configured to look for a robots.txt file on the server before they read any other file from your site. They do this to see if you’ve added special instructions on how to crawl and index your site.
And the robots.txt file is typically stored in the root directory, also known as the main folder of your website.
The URL can look like this:

http://www.example.com/robots.txt

To check the robots.txt file for your own website, simply replace http://www.example.com/ with your domain and add robots.txt at the end.
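If you’re curious how a crawler actually interprets these rules, Python’s standard urllib.robotparser module can parse a robots.txt file and answer whether a given URL may be fetched. Here’s a minimal sketch using hypothetical rules and example URLs:

```python
from urllib.robotparser import RobotFileParser

# Parse a small, hypothetical set of robots.txt rules,
# the same way a well-behaved crawler would.
parser = RobotFileParser()
parser.parse("""
User-agent: *
Disallow: /wp-admin/
""".splitlines())

# Ask whether a generic bot ("*") may fetch specific URLs.
print(parser.can_fetch("*", "https://www.example.com/blog/"))      # True: crawling allowed
print(parser.can_fetch("*", "https://www.example.com/wp-admin/"))  # False: crawling blocked
```

This is how search engine bots decide which pages to request before they ever load your content.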
Now, let’s take a look at what the basic format of a robots.txt file looks like:
```
User-agent: [user-agent name]
Disallow: [URL string not to be crawled]

User-agent: [user-agent name]
Allow: [URL string to be crawled]

Sitemap: [URL of your XML Sitemap]
```
For this to make any sense, we first need to explain what User-agent means. It’s the name of the search engine bot, or robot, that you want to block or allow to crawl your site (for example, the Googlebot crawler).

Second, you can include multiple instructions to either Allow or Disallow specific URLs, as well as add multiple sitemaps. As you’ve probably figured out, the Disallow option tells search engine bots not to crawl those URLs.
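Putting those pieces together, a complete robots.txt file might look like this (the folder name and sitemap URL below are hypothetical, and blocking Bingbot is purely for illustration):

```
# Block all bots from a private folder
User-agent: *
Disallow: /private/

# Block Bingbot from the entire site
User-agent: Bingbot
Disallow: /

# Point all bots to your XML sitemap
Sitemap: https://www.example.com/sitemap.xml
```

Note that each bot follows the most specific group that matches its name, so Bingbot here obeys its own rules rather than the rules under User-agent: *.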
Default Robots.txt File in WordPress
By default, WordPress automatically creates a robots.txt file for your site. So even if you don’t lift a finger, your site should already have the default WordPress robots.txt file.
But when you later customize it with your own rules, the default content is replaced.
Here’s how the default WordPress robots.txt file looks:
```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
```
The asterisk in User-agent: * means that the rules apply to all web robots that visit your site. And as mentioned, Disallow: /wp-admin/ tells robots not to visit your wp-admin pages, while Allow: /wp-admin/admin-ajax.php makes an exception for the admin-ajax.php file, which many plugins rely on.
You can test your robots.txt file by adding /robots.txt at the end of your domain name. For example, if you enter “https://aioseo.com/robots.txt” in your web browser, it will show the robots.txt file for AIOSEO:
Now that you know what a robots.txt file is and the basics of how it works, let’s take a look at why the robots.txt file matters in the first place.
Why Is the Robots.txt File Important?
The robots.txt file is important if you want to:
- Optimize Your Site Load Speed — by telling bots not to waste time on pages you don’t want them to crawl and index, you can free up resources and increase your site’s load speed.
- Optimize Your Server Usage — blocking bots that waste resources will clean up your server and reduce 404 errors.
When to Use the Meta Noindex Tag Instead of Robots.txt
However, if your primary goal is to stop certain pages from appearing in search engine results, the proper approach is to use a meta noindex tag.

This is because robots.txt doesn’t directly tell search engines not to index content – it just tells them not to crawl it.
In other words, you can use robots.txt to add specific rules for how search engines and other bots interact with your site, but it won’t explicitly control whether your content is indexed or not.
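For reference, the noindex directive is a single tag placed in the <head> section of the page you want to keep out of search results:

```
<meta name="robots" content="noindex">
```

Search engines that support the tag will drop the page from their results the next time they crawl it — which also means the page must remain crawlable so the tag can be seen.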
With that said, let’s show you how to easily edit your robots.txt file in WordPress step by step using AIOSEO.
Edit Robots.txt in WordPress Using AIOSEO
The easiest way to edit the robots.txt file is by using the best WordPress SEO plugin, All in One SEO (AIOSEO). It allows you to take control of your website and configure a robots.txt file that will override the default WordPress file.
If you didn’t know this already, AIOSEO is a complete WordPress SEO plugin, which lets you optimize your content for search engines and increase rankings with just a few clicks. Check out our powerful SEO tools and features here.
Enable Custom Robots.txt
To get started editing your robots.txt file, click on Tools in the All in One SEO menu, and then click on the Robots.txt Editor tab.
AIOSEO will then generate a dynamic robots.txt file. Its content is stored in your WordPress database and can be viewed in your web browser as we’ll show you in a bit.
Once you’ve entered the Robots.txt Editor, you need to Enable Custom Robots.txt.
Click on the button so it turns blue.
You’ll then see the Robots.txt Preview section at the bottom of the screen, which shows the WordPress default rules that you can overwrite with your own.
The default rules tell robots not to crawl your core WordPress files (admin pages). It’s also recommended to block crawling of your plugin and theme files, since they don’t include any relevant content and are unnecessary for search engines to crawl.
Now, let’s move on to how you can add your own rules using the rule builder.
Adding Rules Using the Rule Builder
The rule builder is used to add your own custom rules for what pages the robots should crawl or not.
For instance, if you’d like to add a rule that blocks all robots from a temp directory (a folder used for temporary files on your server), you can use the rule builder to do this.
Like in this example:
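For reference, a rule like the one in this example would produce the following lines in your robots.txt file (assuming the temporary folder is named /temp/):

```
User-agent: *
Disallow: /temp/
```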
To add a custom rule, simply enter the User Agent (for example, Googlebot) in the User Agent field. Or you can use the * symbol to make your rule apply to all user agents (robots).
Next, select either Allow or Disallow to allow or block the User Agent.
After you’ve decided what bots to allow or disallow, you need to enter the directory path or filename in the Directory Path field.
Once this is done, you can go ahead and click the Save Changes button in the bottom right corner of the page.
And if you want to add more rules, you can click the Add Rule button and repeat the steps above.
Don’t forget to save your changes when you’re done.
When you’ve saved your new rules, they’ll appear in the Robots.txt Preview section.
To view your robots.txt file, you simply click on the Open Robots.txt button.
Now, let’s take a look at how you can edit your rules next.
Editing Rules Using the Rule Builder
To edit your rules, you can just change the details in the rule builder and click on the Save Changes button.
It’s very easy! And so is deleting rules like we’ll show you next.
Deleting a Rule in the Rule Builder
To delete a rule, simply click the trash can icon to the right of the rule.
As simple as that!
Before leaving this topic, we want to let you know that in case you need to edit your robots.txt file for multisite networks, you can check out our documentation on how to do that here.
We hope this tutorial showed you how to easily edit a robots.txt file in WordPress. Now, go ahead and add your own rules, and your website will be optimized for peak performance in no time.
One more thing…
…do you think Google Search Console is confusing? It doesn’t have to be!
Check out our guide on how to easily verify your WordPress site using AIOSEO.
Stay tuned for AIOSEO’s new features and improvements.