How to Edit & Optimize WordPress Robots.txt File for SEO


Have you optimized your WordPress Robots.txt file for SEO?

If you haven’t, you are ignoring an important aspect of SEO. The robots.txt file plays a significant role in your site’s SEO.

Luckily, WordPress automatically creates a robots.txt file for you. Having this file is half the battle; to get the full benefit, you have to make sure it is optimized.

The robots.txt file tells search engine bots which pages to crawl and which to avoid. In this post, I will show you how to edit and optimize the robots.txt file in WordPress.

What is Robots.txt File?

Let’s start with the basics.

Robots.txt is a text file that instructs search engine bots how to crawl and index a site. Whenever a search engine bot comes to your site, it reads the robots.txt file and follows the instructions. Using this file, you can tell bots which parts of your site to crawl and which parts to avoid. However, the absence of a robots.txt file will not stop search engine bots from crawling and indexing your site.
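To see how a well-behaved bot applies these instructions, here’s a minimal sketch using Python’s built-in urllib.robotparser. The domain example.com and the /private/ path are placeholders, not rules from any real site.

```python
from urllib import robotparser

# Sketch: how a polite crawler decides whether it may fetch a URL.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# The /private/ section is blocked for all bots; everything else is fair game.
print(rp.can_fetch("Googlebot", "https://example.com/private/page.html"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/blog/post.html"))     # True
```

This is exactly the check crawlers perform before requesting a page; a missing robots.txt simply means every URL is allowed.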

Editing & Understanding Robots.txt in WordPress

I’ve already said that every WordPress site has a default robots.txt file in its root directory. You can check your robots.txt file by going to http://yourdomain.com/robots.txt. For example, you can check ours here: https://roadtoblogging.com/robots.txt

If you don’t have a robots.txt file, you’ll have to create one. It’s very easy to do. Just create a text file on your computer, save it as robots.txt, and upload it to your root directory. You can upload it via an FTP client or cPanel File Manager.

Now let’s see how to edit your robots.txt file.

You can edit your robots.txt file using an FTP client or cPanel File Manager, but that is time-consuming and a bit difficult.

The best way to edit the robots.txt file is to use a plugin. There are several WordPress robots.txt plugins out there. I prefer Yoast SEO, the best SEO plugin for WordPress. I’ve already shared how to set up Yoast SEO.

Yoast SEO allows you to modify the robots.txt file from your WordPress admin area. However, if you don’t want to use the Yoast plugin, you can use other plugins like WP Robots Txt.

Once you’ve installed and activated the Yoast SEO plugin, go to WordPress Admin Panel > SEO > Tools.
Yoast SEO Tools

Then click on “File editor”.

File Editor

Then you need to click on “Create robots.txt file”.

Create robots.txt file

Then you will get the Robots.txt file editor. You can configure your robots.txt file from here.

Robots.txt file editor

Before editing the file, you need to understand its directives. There are mainly three:

  • User-agent – Names the search engine bot the rules apply to, such as Googlebot or Bingbot. You can use an asterisk (*) to refer to all search engine bots.
  • Disallow – Instructs search engine bots not to crawl and index certain parts of your site.
  • Allow – Instructs search engine bots which parts of your site they may crawl and index.

Here’s a sample of Robots.txt file.

User-agent: *
Disallow: /wp-admin/
Allow: /

The first line of this robots.txt file addresses all search engine bots. The second line tells them not to crawl the /wp-admin/ section, and the third line instructs them to crawl and index the rest of the site.
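You can verify that interpretation locally with Python’s built-in urllib.robotparser; the example.com URLs below are placeholders.

```python
from urllib import robotparser

# The sample robots.txt above, fed straight to Python's parser.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /wp-admin/",
    "Allow: /",
])

print(rp.can_fetch("*", "https://example.com/wp-admin/"))     # False
print(rp.can_fetch("*", "https://example.com/a-blog-post/"))  # True
```

Everything under /wp-admin/ is blocked, while ordinary posts and pages stay crawlable.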

Configuring & Optimizing Robots.txt File for SEO

A simple misconfiguration in the robots.txt file can completely deindex your site from search engines. For example, if you use the directive “Disallow: /”, your site will be deindexed from search engines. So you need to be careful while configuring it.

Another important thing is optimizing the robots.txt file for SEO. Before going into the best practices, I’d like to warn you about some bad practices.

  • Don’t use the robots.txt file to hide low-quality content. The best practice is to use the noindex meta tag, which you can add with the Yoast SEO plugin.
  • Don’t use the robots.txt file to stop search engines from indexing your Categories, Tags, Archives, Author pages, etc. You can add noindex meta tags to those pages with the Yoast SEO plugin.
  • Don’t use the robots.txt file to handle duplicate content. There are better ways, such as the canonical tag.

Now let’s see how you can make Robots.txt file SEO friendly.

  1. First, determine which parts of your site you don’t want search engine bots to crawl. I prefer disallowing /wp-admin/, /wp-content/plugins/, /readme.html, and /trackback/.
  2. Adding “Allow: /” directives to the robots.txt file is not that important, as bots will crawl your site anyway. But you can use Allow to open up a specific path for a particular bot.
  3. Adding your sitemaps to the robots.txt file is also a good practice. Read: How to create a Sitemap

Here’s an example of an ideal robots.txt file for WordPress.

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-content/plugins/
Disallow: /readme.html
Disallow: /trackback/
Disallow: /go/
Allow: /wp-admin/admin-ajax.php
Allow: /wp-content/uploads/
Sitemap: https://roadtoblogging.com/post-sitemap.xml
Sitemap: https://roadtoblogging.com/page-sitemap.xml

You can check RTB Robots.txt file here: https://roadtoblogging.com/robots.txt
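Before uploading a file like the one above, you can sanity-check its rules offline with urllib.robotparser. One caveat, and the reason the Allow lines come first in this sketch: Python’s parser applies the first matching rule, whereas Googlebot uses the most specific (longest) match, so ordering matters locally even though Google would resolve it either way. The example.com URLs are placeholders.

```python
from urllib import robotparser

# Local sanity check of the "ideal" rules above.
# Allow lines are listed first because Python's parser is first-match.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Allow: /wp-admin/admin-ajax.php",
    "Allow: /wp-content/uploads/",
    "Disallow: /wp-admin/",
    "Disallow: /wp-content/plugins/",
    "Disallow: /readme.html",
    "Disallow: /trackback/",
])

print(rp.can_fetch("*", "https://example.com/wp-admin/admin-ajax.php"))   # True
print(rp.can_fetch("*", "https://example.com/wp-content/uploads/a.png"))  # True
print(rp.can_fetch("*", "https://example.com/wp-admin/"))                 # False
print(rp.can_fetch("*", "https://example.com/wp-content/plugins/x.js"))   # False
```

Note how admin-ajax.php and the uploads folder stay open to bots even though their parent directories are blocked.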

Testing Robots.txt File in Google Search Console

After updating your robots.txt file, you have to test it to check whether any content is impacted by the update.

You can use Google Search Console to check if there is any “Error” or “Warning” in your robots.txt file. Just log in to Google Search Console and select the site. Then go to Crawl > robots.txt Tester and click on the “Submit” button.

robots.txt Tester

A box will pop up. Just click on the “Submit” button.

update robots.txt file

Then reload the page and check whether the file has updated. It might take some time for the robots.txt file to update.

If it hasn’t updated yet, you can paste your robots.txt code into the box to check for errors or warnings; any issues will be shown there.

robots.txt errors warnings

If you notice any errors or warnings in the robots.txt file, you have to fix them by editing the file.
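For a quick first pass before reaching for Google’s tester, you can also catch obvious typos locally. The lint_robots helper below is a hypothetical illustration, not a replacement for the robots.txt Tester; it only flags lines whose directive name isn’t one of the common ones.

```python
# A tiny illustrative lint for robots.txt files (hypothetical helper).
KNOWN = {"user-agent", "disallow", "allow", "sitemap", "crawl-delay"}

def lint_robots(text):
    """Return (line_number, line) pairs that look malformed."""
    problems = []
    for n, line in enumerate(text.splitlines(), 1):
        line = line.split("#", 1)[0].strip()  # drop comments and whitespace
        if not line:
            continue  # blank lines are fine
        directive, sep, _ = line.partition(":")
        if not sep or directive.strip().lower() not in KNOWN:
            problems.append((n, line))
    return problems

sample = """User-agent: *
Disallow: /wp-admin/
Dissalow: /trackback/
"""
print(lint_robots(sample))  # flags line 3: the misspelled "Dissalow"
```

A misspelled directive like “Dissalow” is silently ignored by search engines, so the page you meant to block stays crawlable; catching it early is the whole point.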

Final Thoughts

I hope this post helped you optimize your WordPress robots.txt file. If you have any confusion, feel free to ask via the comments.

However, if you want to make your WordPress blog SEO friendly, you can read our post on How to Setup WordPress Yoast SEO Plugin.

If you found this post helpful, please help me by sharing it on Facebook, Twitter, or Google+.

Sharing is Caring
Istiak Rayhan
 

Istiak Rayhan is the founder of RoadToBlogging.com, a blog that aims to make bloggers' journey easier. Istiak loves to help newbie bloggers to build a better blog. Here's more about him.

  • Joseph Jones says:

    This is very interesting. I thought that I really need to go to the hosting or doing some ftp stuff to be able to do some robots.txt tweaks. Thanks for sharing Istiak.

  • Vikash Sharma says:

    Hi Istiak,

    A very Useful and informative article. This will really help a lot of people and newbie like me. I am very new to blogging and was not aware of the Robots.txt file. but after reading this article. I have created and setup robots.txt file for my blog http://www.freakabouthealth.com

    Thank you so much for sharing such a detailed article with screenshot which made it more useful. You know, This is my first to this blog and have already spent 3 hours on your blog. I have found lots of useful articles here. Your blog has been saved in my favourite now. I will visit this regularly to learn some new things.

    I would really Thank you for all your time to write and share such valuable articles and guides with us that too free of cost.
    Keep it Up and ALL THE BEST :)

    Regards,
    Vikash Sharma

    • Istiak Rayhan says:

      Thanks for your kind words. Let me know if you need any help.

  • Riju Debnath says:

    Hi Istiyak
    I have already done the setup of Yoast SEO with the help of your previous post. Now from this post I am going to copy your ideal model of robots.txt file into my site. Thanks for such a detailed guide on robots.txt. All my confusions are gone now.

    • Istiak Rayhan says:

      Glad I could help.

  • Hello brother Istiak,
    I got confused about robots.txt file. I know it is very important for crawling the site. Could you tell me- is the creating procedure system is same for any other CMS or any programming language like PHP? However thanks for sharing the knowledge. It will clear the importance of having a robots.txt file.

    • Istiak Rayhan says:

      Yes, it’s same. Just create a robots.txt file on your computer and upload it to your root directory.

  • Pramod Kumar says:

    Thanks, but there is no need to block any Wp directory because if you blocked /Wp-content/plugins/ or /Wp-includes/ then search engines will not crawl your pages properly. So it’s good. Should not block these two folders.

    • Istiak Rayhan says:

      Thanks for the suggestions. Blocking /wp-content/plugins/ is not a problem as this folder only contains plugins. But blocking /wp-includes/ can be a little bit risky as it may block some scripts from bots. So yes, it’s a good idea not to disallow the /wp-includes/ folder.

  • Thanks for the article. I used bits and pieces and so far it seems to be working rather well. I didn’t know much about robots.txt before this, but finally got a working one that registers in Google’s console, again thank you. I had to leave out disallowing /wp-includes/ as this seemed to break the site in Google’s console.

    • Istiak Rayhan says:

      Glad I could help.

  • John says:

    seem to be this function is removed on new wordpress seo plugin.
    is there other way to edit it on admin page?

    • Istiak Rayhan says:

      It’s still there. Check again.

  • Kirstov says:

    Istiak good summary, maybe you can try our free plugin if interested

    https://wordpress.org/plugins/virtual-robotstxt-littlebizzy/

    that can work on Nginx or Apache servers too.
