
How to Add an SEO Friendly Robots.txt file to Blogger?


We have been publishing a series of tutorials on Blogger SEO. I recently covered custom redirection in Blogger and customizing the 404 (page not found) page, and you can also find a tutorial on dynamic meta tag settings on this blog. Continuing the series, today we'll discuss the robots.txt file, which is very important for dealing with search engines' robots. It is a plain text file that normally sits in the site's root directory when the site is on a self-hosted server. In Blogger, however, we don't have access to the root folder, so we manage it from the Search Preferences page instead.

This file instructs search robots about which parts of a website or blog they may access and which they may not. One other key point: if you don't add this file to your blog or site, search robots will simply crawl and index your site as normal. But once you've added it, you are responsible for everything it says, whether you've blocked the entire site or only some specific parts.
When search robots arrive to crawl your site, they first look for a robots.txt file in the root directory. If they don't find it, they index all the files and directories on the site as usual. If they do find it, they read it first and follow its instructions.
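The crawl-time check described above can be sketched with Python's standard library robots.txt parser (the domain and the single Disallow rule are just illustrative placeholders):

```python
import urllib.robotparser

# Sketch of what a well-behaved crawler does before fetching a page.
# Normally you'd call rp.set_url("http://example.com/robots.txt") and
# rp.read(); here we parse the rules directly so the sketch runs offline.
rp = urllib.robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /images.html",
])

# The disallowed page is refused; everything else is allowed.
print(rp.can_fetch("*", "http://www.bestbloggercafe.com/images.html"))  # False
print(rp.can_fetch("*", "http://www.bestbloggercafe.com/about.html"))   # True
```

Note that urllib.robotparser follows the original prefix-matching rules and does not understand Google's * wildcards, so it is suited to simple path rules like this one.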

How to Create a robots.txt file?

As mentioned, this file is only for search robots, not for human visitors, so only robots will read it. It lives in the root directory of any site or blog, like this: www.bestbloggercafe.com/robots.txt. You can create the file by opening Notepad (or any text editor) on your computer and typing the following lines:

User-agent: *
Disallow: /

Now save the file with the name robots.txt. The above is the simplest possible robots.txt file. User-agent: identifies which search robots the rules apply to, and the asterisk (*) means all robots, i.e. Google, Yahoo, Bing, etc. Disallow: / blocks access to everything under the root, so this particular file tells every robot to stay out of the entire site. To block only a particular page or directory, list its path instead. See another example, which disallows a single page on my site:

User-agent: *
Disallow: /images.html

In the robots.txt file above, we told the robots not to access the page images.html, because we don't want them to crawl it. If you want to block access to several pages, simply repeat the Disallow: line, once for each path you want to block. See the example below:

User-agent: *
Disallow: /images.html
Disallow: /plugins.html
Disallow: /fonts.html

In the example above, we disallowed three pages that exist on our site, so search robots will no longer be able to access them.

That was an easy introduction to the robots.txt file. However, as I mentioned earlier in this post, we'll now discuss a search engine friendly robots.txt file specifically for Blogger blogs. Here is the code:

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search?q=*
Disallow: /*?updated-max=*
Allow: /
Sitemap: http://www.bestbloggercafe.com/feeds/posts/default?orderby=updated

The above is a very search engine friendly robots.txt file for Blogger. The Mediapartners-Google group is only for those who use Google AdSense on their blogs: it gives AdSense's crawler unrestricted access so it can match ads to your content. If you don't use AdSense, remove that group. User-agent: * means the rules that follow apply to all robots: everything is allowed (Allow: /) except for the two Disallow lines:

Disallow: /search?q=*
Disallow: /*?updated-max=*

We blocked search robots from the results of our blog's internal search box, because those search result pages show content that already exists elsewhere on the blog and shouldn't be indexed. Likewise, Blogger paginates archives (page 1, page 2, page 3, and so on) using ?updated-max= URLs; when visitors click the Next button, those pagination pages shouldn't be indexed either, since they duplicate the posts they list. Duplicate content like this is not search engine friendly.
We also added our blog's sitemap to robots.txt so that robots can easily discover new content when we update the blog. Just replace the address www.bestbloggercafe.com with your own blog's address.
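To see why those two Disallow rules catch search and pagination URLs, here is a rough Python sketch of how a wildcard-aware robot (such as Googlebot) matches a URL path against them. The * wildcard is Google's extension to the original robots.txt rules; this sketch ignores the $ end-anchor since these rules don't use it, and the sample paths are made up for illustration:

```python
import re

# The two Disallow patterns from the Blogger robots.txt above.
DISALLOW = ["/search?q=*", "/*?updated-max=*"]

def is_blocked(path: str) -> bool:
    for pattern in DISALLOW:
        # '*' matches any run of characters; everything else is literal.
        # A rule matches if the pattern matches a prefix of the path.
        regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in pattern)
        if re.match(regex, path):
            return True
    return False

print(is_blocked("/search?q=seo"))                      # True: internal search result
print(is_blocked("/?updated-max=2014-01-01T00:00:00"))  # True: pagination page
print(is_blocked("/2014/01/my-post.html"))              # False: a normal post
```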

How to add robots.txt file to Blogger?

Now that you understand how this file works, you can easily add it to Blogger. Follow these steps:

  1. Go to blogger Dashboard
  2. Setting >> Search Preferences >> Crawling & Indexing
  3. Find Custom robots.txt and click Edit; a text box will appear
When you see the box, paste the code given above into it and save the changes. You've now successfully added a search engine friendly robots.txt file to your blog.
Note: Don't worry if, after adding this file, Google Webmaster Tools reports that 50 or 60 or so pages were blocked by robots.txt. Those are the search and pagination pages we deliberately blocked, and this doesn't affect your blog's ranking. You can also remove these rules at any time and nothing bad will happen to your blog.
I hope I've given you enough information about the robots.txt file. If you still face any difficulty, let me know by commenting on this post. Until the next tutorial, take good care of yourself and your family.
