Oct 28, 2017

How to Add Custom Robots.txt File in Blogger


A custom robots.txt file is a simple text file in which the website owner writes commands that tell web crawlers which parts of the site to crawl and which to skip. The commands are written in a standard syntax that web crawlers understand.

Also Check Out: How to Add Google Custom Search Engine to Blogger Blog



Ankit Singla, in his blog Master Blogging, describes the Blogger robots.txt file as a text file containing a few lines of simple code. It is saved on the website or blog's server, and it instructs web crawlers on how to crawl and index your blog in the search results.

This means we can allow or disallow search crawlers from specific areas of our blog, telling them what to crawl and what to skip. It is one more step toward making the blog more SEO friendly. The old Blogger interface had no option to add this text file, but in Blogger's new interface we can add it easily, and today I will guide you through how to do it. So let’s get started.

Also Read: How to Schedule Blog Posts for Auto Posting in Blogger



You can check your blog's robots.txt file at this link:


http://www.yourdomain.com/robots.txt
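If you prefer checking from a script, the same URL can be fetched programmatically. Below is a minimal sketch using Python's standard library; `www.yourdomain.com` is a placeholder for your own domain, and the helper names are my own, not part of Blogger:

```python
from urllib.request import urlopen


def robots_txt_url(domain: str) -> str:
    """Build the conventional robots.txt URL for a domain."""
    return f"http://{domain}/robots.txt"


def fetch_robots_txt(domain: str) -> str:
    """Download and return the robots.txt text for a domain."""
    with urlopen(robots_txt_url(domain)) as resp:
        return resp.read().decode("utf-8")

# Example (requires a live domain):
# print(fetch_robots_txt("www.yourdomain.com"))
```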

Benefits of adding a robots.txt file in Blogger: There are a lot of advantages to adding a robots.txt file to your Blogger blog; some of them are:


  • It makes your site easier to find in search engines
  • It helps your articles appear in search results for relevant keywords

Enabling Robots.txt File in Blogger:


This process is easier than you can imagine; all you need to do is follow the simple steps below:

Go To Blogger >> Settings >> Search Preferences



Look for the Custom robots.txt section at the bottom and click Edit.

Now a checkbox will appear. Tick "Yes", and a box will appear where you have to write the robots.txt rules. Enter this:

User-agent: Mediapartners-Google
User-agent: *
Disallow: /search?q=*
Disallow: /*?updated-max=*
Allow: /
Sitemap: http://www.yourdomain.com/feeds/posts/default?orderby=updated


Note: The first line, "User-agent: Mediapartners-Google", is for Google AdSense. If you are using Google AdSense on your blog, keep it; otherwise, remove it.

Click "Save Changes", and you are done! Now let’s take a look at what each line means:

User-agent: Mediapartners-Google: This first command is for Google AdSense-enabled blogs; if you are not using Google AdSense, remove it. With this command, we're telling Google's AdSense crawler that it may crawl all pages where AdSense ads are placed.

User-agent: *: Here User-agent addresses a robot, and * stands for all search engine robots, like Google, Bing, etc.

Disallow: /search?q=*: This line tells the search engine's crawler not to crawl the search pages.

Disallow: /*?updated-max=*: This one tells search engine crawlers not to index or crawl label or navigation pages.

Allow: /: This one allows crawlers to index the homepage of your blog.

Sitemap: This last command points the search engine's crawler at the blog feed so that every new or updated post gets indexed.
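To sanity-check how rules like these behave, you can parse them locally with Python's standard-library `urllib.robotparser`. One caveat: that parser only does plain prefix matching and does not understand the `*` wildcard that Google's crawler supports, so this sketch uses a simplified `Disallow: /search` prefix rule to stand in for the wildcard lines above:

```python
from urllib.robotparser import RobotFileParser

# Simplified version of the rules above: urllib.robotparser does not
# support Google's `*` wildcards, so `/search` stands in for `/search?q=*`.
rules = """\
User-agent: *
Disallow: /search
Allow: /
Sitemap: http://www.yourdomain.com/feeds/posts/default?orderby=updated
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# The homepage and ordinary posts are crawlable; search pages are not.
print(parser.can_fetch("*", "http://www.yourdomain.com/"))            # True
print(parser.can_fetch("*", "http://www.yourdomain.com/search?q=seo"))  # False
```

For the real wildcard rules, the authoritative check is Google Search Console's robots.txt tester rather than this local sketch.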

You can also add your own commands to allow or disallow more pages. For example, if you want to allow or disallow a specific page, you can use these commands. To allow a page:


Allow: /p/contact.html

To Disallow:


Disallow: /p/contact.html
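A page-level rule like this can also be verified locally with the standard-library parser; `/p/contact.html` is just the example page from above. Note that `urllib.robotparser` applies the first rule whose path matches, so the specific Disallow is listed before the blanket Allow:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rule set that blocks only the contact page.  The specific
# Disallow line comes before the blanket Allow because urllib.robotparser
# applies the first rule whose path matches the URL.
rules = """\
User-agent: *
Disallow: /p/contact.html
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

print(parser.can_fetch("*", "http://www.yourdomain.com/p/contact.html"))  # False
print(parser.can_fetch("*", "http://www.yourdomain.com/p/about.html"))    # True
```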


I hope you have learned how to add a robots.txt file to your Blogger blog! If you are facing any difficulty, please let me know in the comment box below. It’s your turn to say thanks in the comments, and keep sharing this post. Till then: peace, blessings, and happy adding!




post written by:

Hey! I’m Muhammad Abba Gana, popularly known as AbbaGana, a blog Scientist by mind and a passionate blogger by heart fountainhead of Guidetricks, Duniyan Fasaha, Duniyar Yau, Hanyantsirah, Gidan Novels, Abba Gana Novels and Be With Me Technologies, I am twenty something year old guy from Jimeta, Adamawa State, Nigeria. I’m a Freelance writer, Information marketer, professional blogger, Web designer, Internet speaker, software Developer and also an author. I make living with my laptop and can work from anywhere I find myself (as long as there is a power supply and a reliable internet connection).
