
How to Optimize Your WordPress Robots.txt for SEO

Do you want to optimize your WordPress robots.txt file? Not sure why and how a robots.txt file is important for your SEO? We have got you covered. In this article, we will show you how to optimize your WordPress robots.txt for SEO and help you understand the importance of the robots.txt file.

Recently, a user asked us whether they need a robots.txt file and why it is important. Your site’s robots.txt file plays an important role in your site’s overall SEO performance. It basically allows you to communicate with search engines and let them know which parts of your site they should index.

Understanding the importance of robots.txt in WordPress SEO

Do I Really Need a Robots.txt File?

The absence of a robots.txt file will not stop search engines from crawling and indexing your website. However, it is highly recommended that you create one. If you want to submit your site’s XML sitemap to search engines, then this is where search engines will look for it, unless you have specified its location in Google Webmaster Tools.

We highly recommend that if you do not have a robots.txt file on your site, then you immediately create one.

Where is the Robots.txt file? How to Create a Robots.txt file?

The robots.txt file usually resides in your site’s root folder. You will need to connect to your site using an FTP client or cPanel’s file manager to view it.

Because it is just an ordinary text file, you can open it with a plain text editor like Notepad.

If you do not have a robots.txt file in your site’s root directory, then you can always create one. All you need to do is create a new text file on your computer and save it as robots.txt. Next, simply upload it to your site’s root folder.

How to Use Robots.txt file?

The format for a robots.txt file is actually quite simple. The first line usually names a user agent. The user agent is the name of the search bot you are trying to communicate with, for example Googlebot or Bingbot. You can use an asterisk * to address all bots.

The next line follows with Allow or Disallow instructions for search engines, so they know which parts you want them to index, and which ones you don’t want indexed.

See a sample robots.txt file:

User-Agent: *
Allow: /wp-content/uploads/
Disallow: /wp-content/plugins/
Disallow: /readme.html

In this sample robots.txt file for WordPress, we have instructed all bots to index our image upload directory.

In the next two lines we have disallowed them to index our WordPress plugins directory and the readme.html file.
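If you want to double-check how a compliant bot would read these rules, you can test them with Python’s standard-library robotparser module. This is just an illustrative sketch; the example.com URLs below are placeholders, not a real site:

```python
from urllib import robotparser

# The sample rules from this article. The example.com domain used in the
# checks below is a placeholder for your own site.
rules = """\
User-Agent: *
Allow: /wp-content/uploads/
Disallow: /wp-content/plugins/
Disallow: /readme.html
"""

parser = robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# Uploads are allowed; the plugins directory and readme.html are blocked.
print(parser.can_fetch("*", "https://example.com/wp-content/uploads/logo.png"))  # True
print(parser.can_fetch("*", "https://example.com/wp-content/plugins/akismet/"))  # False
print(parser.can_fetch("*", "https://example.com/readme.html"))                  # False
```

This is a quick way to verify your rules before uploading a new robots.txt file to your site.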

Optimizing Your Robots.txt File for SEO

In its guidelines for webmasters, Google advises webmasters not to use the robots.txt file to hide low quality content. If you were thinking about using the robots.txt file to stop Google from indexing your category, date, and other archive pages, then that may not be a wise choice.

Remember, the purpose of robots.txt is to instruct bots what to do with the content they crawl on your site. It does not stop bots from crawling your website.

There are WordPress plugins which allow you to add meta tags like nofollow and noindex to your archive pages; the WordPress SEO plugin also allows you to do this. We are not saying that you should have your archive pages deindexed, but if you wanted to do that, then this is the proper way to do it.

The proper way of adding noindex to archive pages in WordPress
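For reference, the tag such plugins add to an archive page’s head section looks something like this (an illustrative snippet, not the exact output of any particular plugin):

```
<!-- Tells search engines not to index this page, but still follow its links.
     An SEO plugin generates this for you; you do not edit it by hand. -->
<meta name="robots" content="noindex, follow">
```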

You do not need to add your WordPress login page, admin directory, or registration page to robots.txt, because WordPress already adds a noindex meta tag to the login and registration pages.

We recommend that you disallow the readme.html file in your robots.txt file. This readme file can be used by someone trying to figure out which version of WordPress you are using; an individual can easily access it by simply browsing to it.

On the other hand, if someone is running a malicious query to locate WordPress sites using a specific version, then this disallow rule can protect you from those mass attacks.

You can also disallow your WordPress plugin directory. This will strengthen your site’s security if someone is looking for a specific vulnerable plugin to exploit in a mass attack.

Adding Your XML Sitemap to Robots.txt File

If you are using Yoast’s WordPress SEO plugin or some other plugin to generate your XML sitemap, then it will try to automatically add the sitemap related lines to your robots.txt file.

However, if it fails, then your plugin will show you the link to your XML sitemaps, which you can add to your robots.txt file manually like this:
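The sitemap lines would look something like this (example.com is a placeholder; use the sitemap URL your plugin shows you):

```
Sitemap: http://example.com/post-sitemap.xml
Sitemap: http://example.com/page-sitemap.xml
```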

What Should an Ideal Robots.txt File Look Like?

Honestly, many popular blogs use very simple robots.txt files. Their contents vary, depending on the needs of the specific site:

User-agent: *
Disallow:

This robots.txt file simply tells all bots that they are allowed to index all of the site’s content.

Here is another example of a robots.txt file, this time it is the one we use here on WPBeginner:

User-Agent: *
Allow: /?display=wide
Allow: /wp-content/uploads/
Disallow: /wp-content/plugins/
Disallow: /readme.html
Disallow: /refer/

That’s all. We hope this article helped you learn how to optimize your WordPress robots.txt file for SEO.

