How to optimize your WordPress Robots.txt for SEO

Recently one of our readers asked us for tips on how to optimize the robots.txt file to improve SEO.

The Robots.txt file tells search engines how to crawl your website, which makes it an incredibly powerful SEO tool.

In this article, we are going to show you how to create a perfect robots.txt file for SEO.

What is the robots.txt file?

Robots.txt is a text file that website owners can create to tell search engine bots how to crawl and index pages on their website.

It is usually stored in the root directory of your website, also known as the main folder. The basic format for a robots.txt file looks like this:

User-agent: [user-agent name]
Disallow: [URL string not to be crawled]

User-agent: [user-agent name]
Allow: [URL string to be crawled]

Sitemap: [URL of your XML sitemap]

You can use multiple lines of instructions to allow or disallow specific URLs, and you can add multiple sitemaps. If you don’t disallow a URL, search engine bots assume they are allowed to crawl it.

An example file for robots.txt could look like this:

User-agent: *
Allow: /wp-content/uploads/
Disallow: /wp-content/plugins/
Disallow: /wp-admin/

Sitemap: https://example.com/sitemap_index.xml

In the robots.txt example above, we have allowed search engines to crawl and index files in our WordPress uploads folder.

After that, we disallowed search bots from crawling and indexing the plugins folder and the WordPress admin folder.

Finally, we entered the URL of our XML sitemap.

Do you need a Robots.txt file for your WordPress site?

If you don’t have a robots.txt file, search engines will still crawl and index your website. However, you won’t be able to tell them which pages or folders they should not crawl.

This doesn’t have much of an impact if this is your first time blogging and you don’t have a lot of content.

However, as your website grows and you have a lot of content, you probably want more control over how your website is crawled and indexed.

Here’s why.

Search bots have a crawl quota for every website.

This means that they crawl a certain number of pages during a crawl session. If they don’t crawl all of the pages on your site, they come back and continue crawling in the next session.

This can slow your website’s indexing rate.

You can fix this by preventing search bots from trying to crawl unnecessary pages like your WordPress admin pages, plugin files, and theme folders.

By disallowing unnecessary pages, you save your crawl quota. This helps search engines crawl even more pages on your website and index them as quickly as possible.

Another good reason to use the robots.txt file is when you want to prevent search engines from indexing a post or page on your website.

This isn’t the safest way to hide content from the public, but it helps keep that content from appearing in search results.
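For example, a rule to keep bots away from a single page could look like this (the /thank-you/ path is purely illustrative):

User-agent: *
Disallow: /thank-you/

Note that a page blocked this way can still show up in results as a bare URL if other sites link to it, so a noindex meta tag is the more reliable option for truly sensitive content.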

What does an ideal Robots.txt file look like?

Many popular blogs use a very simple robots.txt file. Its content may vary depending on the needs of the particular website:

User-agent: *
Disallow:

Sitemap: http://www.example.com/post-sitemap.xml
Sitemap: http://www.example.com/page-sitemap.xml

This robots.txt file allows all bots to index all content and provides them with a link to the website’s XML sitemaps.

For WordPress sites, we recommend the following rules in the robots.txt file:

User-agent: *
Allow: /wp-content/uploads/
Disallow: /wp-content/plugins/
Disallow: /wp-admin/
Disallow: /readme.html
Disallow: /refer/

Sitemap: http://www.example.com/post-sitemap.xml
Sitemap: http://www.example.com/page-sitemap.xml

This will tell search bots to index all WordPress images and files. It prevents search bots from indexing WordPress plugin files, the WordPress admin area, the WordPress readme file, and affiliate links.

Adding sitemaps to the robots.txt file makes it easy for Google bots to find all the pages on your website.

Now that you know what an ideal robots.txt file looks like, let’s see how you can create a robots.txt file in WordPress.

How do I create a Robots.txt file in WordPress?

There are two ways to create a robots.txt file in WordPress. You can choose the method that works best for you.

Method 1: Editing the Robots.txt file with All in One SEO

All in One SEO, also known as AIOSEO, is the best WordPress SEO plugin out there, used by over 2 million websites.

It’s easy to use and comes with a robots.txt file generator.

If you haven’t installed the AIOSEO plugin yet, see our step-by-step guide on how to install a WordPress plugin.

Note: A free version of AIOSEO is also available and has this feature.

As soon as the plugin is installed and activated, you can use it to create and edit your robots.txt file directly in your WordPress administration area.

Simply go to All in One SEO » Tools to edit your robots.txt file.

AIOSEO robots.txt editor

First, you need to turn on editing by switching the “Enable Custom Robots.txt” toggle to blue.

This option allows you to create a custom robots.txt file in WordPress.

AIOSEO enables custom robots.txt

All in One SEO displays your existing robots.txt file in the “Robots.txt Preview” section at the bottom of the screen.

This version shows the default rules added by WordPress.

Standard rules for Robots.txt

These default rules tell the search engines not to crawl your core WordPress files, allow the bots to index all of the content, and provide them with a link to your site’s XML sitemaps.
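On a typical WordPress installation, those defaults look something like this (since WordPress 5.5, a Sitemap line pointing to the built-in wp-sitemap.xml is usually appended as well, unless an SEO plugin overrides it):

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://example.com/wp-sitemap.xml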

Now you can add your own custom rules to improve your robots.txt for SEO.

To add a rule, enter a user agent in the User Agent field. If you use a *, the rule will be applied to all user agents.

Then choose whether you want to “Allow” or “Disallow” search engines to crawl it.

Next, enter the file name or directory path in the Directory Path field.

Add rule in robots.txt

The rule is automatically applied to your robots.txt. To add another rule, click the “Add rule” button.
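For instance, a rule with the user agent * and the hypothetical directory path /refer/ set to “Disallow” would add these lines to your robots.txt:

User-agent: *
Disallow: /refer/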

We recommend adding rules until you have the ideal robots.txt format that we shared above.

Your custom rules look like this.

Custom rule file Robots.txt

When you’re done, don’t forget to click the “Save Changes” button to save your changes.

Method 2: Manually editing the Robots.txt file via FTP

This method requires you to use an FTP client to edit the robots.txt file.

Simply connect to your WordPress hosting account using an FTP client.

Once inside, you can view the robots.txt file in the root folder of your website.

FTP connection robots.txt

If you don’t see one, you probably don’t have a robots.txt file.

In that case, you can just create one.

Create FTP connection robots.txt

Robots.txt is a plain text file, so you can download it to your computer and edit it with any plain text editor like Notepad or TextEdit.

After saving your changes, upload the file back to your website’s root folder.
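If you want to verify that the new file is actually being served, a quick script like the one below fetches the live robots.txt. This is a minimal sketch, and example.com stands in for your own domain:

import urllib.request

# Fetch and print the live robots.txt to confirm the uploaded version is served.
url = "https://example.com/robots.txt"  # replace with your own domain
with urllib.request.urlopen(url) as response:
    print(response.read().decode("utf-8"))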

How do I test my Robots.txt file?

Once you’ve created your robots.txt file, it’s always a good idea to test it with a robots.txt tester tool.

There are many robots.txt testing tools out there, but we recommend using the one inside Google Search Console.

First of all, your website must be linked to the Google Search Console. If you haven’t already, check out our guide on adding your WordPress site to the Google Search Console.

You can then use the Google Search Console robots.txt testing tool.

Select the website property robots.txt tester

Simply select your property from the drop-down list.

The tool will automatically retrieve your website’s robots.txt file and highlight any errors and warnings it finds.

Results of the Robots.txt tester
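If you’d rather test your rules locally, Python’s standard library ships with urllib.robotparser, which can parse your file and answer crawl questions. Here is a minimal sketch, with example.com again standing in for your own domain:

from urllib.robotparser import RobotFileParser

# Load the live robots.txt and check whether specific URLs may be crawled.
rp = RobotFileParser()
rp.set_url("https://example.com/robots.txt")  # replace with your own domain
rp.read()

# With the recommended rules above, the admin area should be blocked
# while uploaded files remain crawlable.
print(rp.can_fetch("*", "https://example.com/wp-admin/"))                   # expect: False
print(rp.can_fetch("*", "https://example.com/wp-content/uploads/img.jpg"))  # expect: True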

Final thoughts

The goal of optimizing your robots.txt file is to prevent search engines from crawling pages that are not publicly available, such as pages in your wp-plugins folder or your WordPress admin folder.

A common myth among SEO pros is that blocking WordPress categories, tags, and archive pages improves crawl rate and results in faster indexing and higher rankings.

That is not true. It also violates Google’s webmaster guidelines.

We recommend that you use the robots.txt format above to create a robots.txt file for your website.

We hope this article has helped you learn how to optimize your WordPress robots.txt file for SEO. You might also want to see our ultimate WordPress SEO guide and best WordPress SEO tools for growing your website.

If you enjoyed this article, please subscribe to our YouTube channel for WordPress video tutorials. You can also find us on Twitter and Facebook.
