How to Add a Custom Robots.txt File in Blogger - Search Engine Optimization - The WonderSpot


How to add a custom robots.txt file in Blogger

Today we'll discuss how to add a custom robots.txt file in Blogger.

Robots.txt is a text file that tells search engine crawlers how to crawl your blog. It gives instructions to every search engine crawler, telling it which parts of your blog robots may access and which parts should be blocked from indexing. Adding this file to your Blogger blog can bring a remarkable change in your blog traffic. Adding a custom robots.txt file is one more step toward making your blog more SEO friendly.


A custom robots.txt file is now available to Blogger users. Using it, a website owner can write directives telling web crawlers what to crawl and what not to crawl. These directives follow a simple syntax that web crawlers read. Pages restricted in the robots.txt file won't be crawled or indexed in search results, so you can stop bots from crawling unnecessary areas of your site. However, those pages remain publicly viewable to ordinary visitors.

Each Blogger blog has a robots.txt file that comes by default, and it looks something like the one below. You can check your own blog's robots.txt file by adding /robots.txt after your domain name; you will then see codes like those below. This is the default robots.txt of most Blogger sites.

User-agent: Mediapartners-Google
User-agent: *
Disallow: /search
Allow: /
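To see what these default rules do in practice, you can feed them to Python's standard-library robots.txt parser. This is just a quick illustration: the blog URL below is a placeholder, and note that this parser only understands plain path prefixes, not Google-style `*` wildcards.

```python
# Sanity-check the default Blogger rules with Python's stdlib parser.
from urllib.robotparser import RobotFileParser

DEFAULT_RULES = """\
User-agent: Mediapartners-Google
User-agent: *
Disallow: /search
Allow: /
"""

parser = RobotFileParser()
parser.parse(DEFAULT_RULES.splitlines())

# The home page (and ordinary posts) are crawlable...
print(parser.can_fetch("Googlebot", "https://yourblog.blogspot.com/"))
# ...but anything under /search (label and search-result pages) is not.
print(parser.can_fetch("Googlebot", "https://yourblog.blogspot.com/search/label/SEO"))
```

Running this prints `True` for the home page and `False` for the label page, which is exactly the behavior described below.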

What do these robots.txt directives mean?

As you can see above, the default robots.txt file contains a few directives: User-agent, Mediapartners-Google, User-agent: *, Disallow, and Allow (a Sitemap line can also appear here). If you are not yet familiar with these, here is an explanation.
First, you need to know what a user agent is: a software agent, or piece of client software, that acts on your behalf.

Mediapartners-Google: This is the user agent for Google AdSense, which is used to serve more relevant ads on your site based on your blog content. If you disallow it, you won't see any AdSense ads on the blocked pages.

User-agent: *: Now that you know what a user agent is, what is User-agent: *? A user-agent line marked with an asterisk (*) applies to all crawlers and robots, whether Bing's robots, affiliate crawlers, or any other client software. Simply put, it means all search engine crawlers.

Disallow: By adding a Disallow rule, you tell robots not to crawl and index the matching pages.

Disallow: /search: This disallows your blog's search results by default. You are blocking crawlers from the /search directory that comes right after your domain name, so search result pages will never be crawled or indexed.
In Blogger, the search option is tied to labels. If you are not using labels wisely on every post, you should disallow crawling of search links; this asks all search engine crawlers to exclude /search.

Allow: The line Allow: / specifically permits search engines to crawl the home page.

Sitemap: In robots.txt you can also write the location of your sitemap file. A sitemap is a file on the server that contains the permalinks of all your blog's posts. It helps crawlers find and index all your accessible pages, which is why robots.txt commonly points crawlers to the sitemap. There is an issue with the default Blogger sitemap, so learn how to create a sitemap in Blogger and notify search engines.
Blogger serves sitemap entries through its feed, and by that method only the 25 most recent posts are submitted to search engines. With the robots.txt code above, search engine bots will only work on the 25 most recent posts on your blog.

What pages should you disallow in Blogger?

This question is a little tricky, and we cannot predict exactly which pages you should allow or disallow on your blog. You can disallow pages like the privacy policy, terms and conditions, cloaked affiliate links, labels, and search results; it all depends on you. That said, since you may get some reasonable traffic from search results, it is not recommended to disallow the labels, privacy policy, and TOS pages.

How to disallow pages in Blogger using robots.txt

You can easily prevent search engines from crawling and indexing particular pages or posts in Blogger using the robots.txt file.

There is usually no reason to block search engines from a particular post, but if you wish to, just add Disallow: /year/month/your-post-url.html to your robots.txt file. That is, copy the part of the post URL that comes after your domain name and add it to robots.txt.

The same goes for disallowing a particular page: copy the page URL path after your domain name and add it like this:
Disallow: /p/your-page-name.html

To Allow a Page:
Allow: /p/contact.html

To Disallow:
Disallow: /p/contact.html
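Putting these pieces together, a complete custom robots.txt that blocks search results, one post, and one page might look like the sketch below (the post and page paths are placeholders, to be replaced with your own):

```text
User-agent: Mediapartners-Google
User-agent: *
Disallow: /search
Disallow: /year/month/your-post-url.html
Disallow: /p/your-page-name.html
Allow: /
```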

Okay, now let's see how to add a custom robots.txt file in Blogger.

How to add custom robots.txt file in Blogger? 

1. Log in to your Blogger blog.
2. Go to the dashboard.
3. Then select Settings.
4. Now click on Search Preferences.
5. Look for the Custom robots.txt section at the bottom and click Edit.
6. A choice will appear; tick Yes, and a box will open where you have to write the robots.txt file. Just enter the codes below:
Just enter below codes,

User-agent: Mediapartners-Google
User-agent: *
Disallow: /search?q=*
Disallow: /*?updated-max=*
Allow: /

If you want search engine bots to crawl your most recent 500 posts, add the following Sitemap line below the code above:

Sitemap: https://www.yourdomain.com/atom.xml?redirect=false&start-index=1&max-results=500

If you already have more than 500 posts, add one more Sitemap line below the previous one:
Sitemap: https://www.yourdomain.com/atom.xml?redirect=false&start-index=501&max-results=500

*Replace www.yourdomain.com with your blog address or custom domain.
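Since each feed-based Sitemap URL covers at most 500 posts, the full list of Sitemap lines for any number of posts can be generated mechanically. Here is a small sketch; the function name and domain are illustrative, not part of Blogger itself.

```python
def sitemap_lines(domain: str, total_posts: int, page_size: int = 500) -> list[str]:
    """Build one feed-based Sitemap line per block of `page_size` posts."""
    lines = []
    start = 1
    while start <= total_posts:
        lines.append(
            f"Sitemap: https://{domain}/atom.xml"
            f"?redirect=false&start-index={start}&max-results={page_size}"
        )
        start += page_size
    return lines

# A blog with 1200 posts needs three Sitemap lines
# (start-index 1, 501, and 1001).
for line in sitemap_lines("www.yourdomain.com", 1200):
    print(line)
```

Each printed line can be pasted directly under the robots.txt code above.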

If you have organized your post labels well and have good experience with SEO, then you can remove the following line:
Disallow: /search

7. Finally, click Save changes.

To check your robots.txt, just add /robots.txt at the end of your blog URL and you will see your custom robots.txt file. After adding your custom robots.txt file, you can submit your blog to search engines.
By now you should have a good understanding of the robots.txt feature on your Blogger blog.

*We recommend using the Google Search Console robots.txt tester to check your robots.txt for any errors or warnings.
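A final note on the wildcard rules used above (Disallow: /search?q=* and Disallow: /*?updated-max=*): Python's standard robotparser does not understand Google-style * wildcards, but you can sanity-check a single rule with a small regex sketch like the one below. The function is illustrative only, and it ignores the $ end-anchor that Google's matcher also supports.

```python
import re

def rule_matches(rule_path: str, url_path: str) -> bool:
    """Return True if url_path matches a robots.txt rule path that may
    contain '*' wildcards (prefix match, as in Google's matcher)."""
    # Escape the literal segments, then rejoin them with '.*' for each '*'.
    pattern = ".*".join(re.escape(part) for part in rule_path.split("*"))
    return re.match(pattern, url_path) is not None

# Pagination URLs are caught by the updated-max wildcard rule...
print(rule_matches("/*?updated-max=*", "/search?updated-max=2020-01-01"))  # True
# ...while ordinary pages are not caught by the search rule.
print(rule_matches("/search?q=*", "/p/contact.html"))                      # False
```

This mirrors how Google's crawler interprets those two Disallow lines: pagination and search-query URLs are blocked, while regular posts and pages stay crawlable.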

Okay, my friends. I think you now have a better idea of how to add a custom robots.txt file in Blogger.
Now you can start earning organic traffic.
Enjoy your blogging!
