Robots.txt is important for every website or blog because this file is submitted through Google Webmaster Tools; it instructs Google's web crawlers on how to crawl and index your website for the search results. Robots.txt is a plain-text file containing a few lines of directives for web crawlers, and it is saved in the root directory of the website's file system.
The robots.txt file also indicates which pages of your site you don't want search engine crawlers to access, but for the purposes of this Blogger SEO tutorial we block only the search page URLs. This robots.txt is standard and causes no problems at all.
By default, the Custom robots.txt section of a Blogger blog is empty, so you need to create the file yourself; the code is ready for you below.
Here's the robots.txt for Blogger:
User-agent: *
# Block
Disallow: /search

# Google AdSense
User-agent: Mediapartners-Google
Disallow:

# Refers to the Homepage
Allow: /

# Sitemap 1
Sitemap: http://yourblog.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500

# Sitemap 2, reserved for the future; once your blog's posts pass 500, just remove the #
# Sitemap: http://yourblog.blogspot.com/atom.xml?redirect=false&start-index=501&max-results=1000
DON’T FORGET TO CHANGE THIS URL TO YOUR OWN BLOG’S ADDRESS
Explanation of the robots.txt
My version of robots.txt is divided into five sections. Before we add it to the Blogger admin, let's read the definition of each of them.
If you have any programming knowledge, you'll recognize the asterisk character '*' as a wildcard. It specifies that this portion (and the lines beneath it) applies to all incoming spiders, robots, and crawlers.
Disallow specifies what crawlers are not allowed to do on your website; here it means the search-results pages of your site, that is, any URL whose path begins with /search, will never be crawled and indexed.
This code is for the Google AdSense robot, which helps serve better ads on your website. Whether or not you have AdSense ads on your blog, simply leave it there.
This code refers to the home page; it means web crawlers can crawl and index our website or blog's homepage.
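To sanity-check how these rules behave, here is a short sketch using Python's standard `urllib.robotparser` module. The blog address is a placeholder, and the rules string simply mirrors the robots.txt above:

```python
import urllib.robotparser

# The rules from the robots.txt above; "yourblog.blogspot.com" is a
# placeholder for your own blog's address.
rules = """\
User-agent: *
Disallow: /search

User-agent: Mediapartners-Google
Disallow:
Allow: /
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Search-result pages are blocked for ordinary crawlers...
print(rp.can_fetch("*", "http://yourblog.blogspot.com/search/label/SEO"))   # False
# ...regular post pages are not...
print(rp.can_fetch("*", "http://yourblog.blogspot.com/2014/01/my-post.html"))  # True
# ...and the AdSense crawler (Mediapartners-Google) may fetch everything.
print(rp.can_fetch("Mediapartners-Google", "http://yourblog.blogspot.com/search/label/SEO"))  # True
```

This confirms the intent of the file: only /search pages are withheld from general crawlers, while AdSense keeps full access.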
This code refers to the sitemap of your blog. By adding the sitemap we are simply optimizing our blog's crawling rate: when the web crawlers scan the robots.txt file, they find a path to our sitemap, where the links to all of our published posts are kept up to date.
As you can see, the robots.txt for Blogger has a Sitemap 1 and a Sitemap 2. Once your blog's post count passes 500, just uncomment the Sitemap 2 line by removing the sharp comment sign (#).
This is because Blogger's XML sitemap (the atom feed) returns posts 1 – 500 per atom URL; to continue the numbering, you add another atom URL for posts 501 – 1000, and so on.
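The 500-posts-per-URL chunking above can be sketched with a small Python helper. The function name and blog address are illustrative, not part of Blogger itself:

```python
# Blogger's atom feed serves at most 500 posts per request, so a blog
# needs one Sitemap line per 500-post chunk. "yourblog.blogspot.com"
# is a placeholder for your own blog's address.
def sitemap_urls(total_posts, base="http://yourblog.blogspot.com", chunk=500):
    urls = []
    # start-index runs 1, 501, 1001, ... until all posts are covered.
    for start in range(1, max(total_posts, 1) + 1, chunk):
        urls.append(
            f"{base}/atom.xml?redirect=false"
            f"&start-index={start}&max-results={chunk}"
        )
    return urls

# A blog with 1200 posts needs three sitemap lines:
for u in sitemap_urls(1200):
    print("Sitemap:", u)
```

For a blog with 1200 posts this prints three Sitemap lines, covering posts 1 – 500, 501 – 1000, and 1001 – 1200.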
Adding Custom robots.txt to Blogger Admin
Go to ‘Blogger Admin’ -> ‘Settings’ -> ‘Search Preferences’ -> ‘Custom robots.txt’, add the text, and save your changes (or see the screenshot).
After adding the custom robots.txt, try previewing it in a web browser at your blog's /robots.txt address.
This tutorial is part of the Blogger SEO for Beginners course on my YouTube channel. Like this post? Kindly like it and share it with your friends. Thanks for reading.