How to create a robots.txt file in Blogger
Robots.txt
Robots.txt is a file that tells search engines which pages to crawl and which not to crawl.
This is helpful for websites that want to block crawler access to certain pages: by disallowing those pages, you are telling search engines not to crawl them.
Default robots.txt sitemap entry for Blogger blogs:
Sitemap: http://www.yourblog.blogspot.com/feeds/posts/default?orderby=UPDATED
How to set a custom robots.txt for your blog:
User-agent: Mediapartners-Google
Disallow:
User-agent: *
Disallow: /search
Allow: /
Sitemap: http://yourblog.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500
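To see what these rules actually do, here is a small sketch using Python's standard urllib.robotparser module to test the custom rules above against some example Blogger URLs (the blog address and paths are placeholders, not real pages):

```python
from urllib.robotparser import RobotFileParser

# The custom rules from above (the Sitemap line is omitted;
# it does not affect crawl permissions).
rules = """User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Blogger label/search pages fall under /search, so ordinary crawlers are blocked.
print(parser.can_fetch("*", "http://yourblog.blogspot.com/search/label/news"))  # False

# Regular post pages are still crawlable.
print(parser.can_fetch("*", "http://yourblog.blogspot.com/2024/01/a-post.html"))  # True
```

This shows why the Disallow: /search line matters on Blogger: label and search-result pages duplicate your post content, so blocking them keeps crawlers focused on the posts themselves.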
This is the robots.txt file. Go to Search Preferences in your blog's settings, click Edit next to "Custom robots.txt", and paste it there.