
How to Add Custom Robots.txt to Blogger


WHAT IS A ROBOTS.TXT FILE?

According to Google, a robots.txt file tells search engine crawlers which URLs of your Blogger/Blogspot website they may crawl and index. Use this file with valid directives only: incorrect rules may be ignored by Google, or may block pages you actually want indexed.

Adding a custom robots.txt file lets you allow or disallow crawling for search engines, instructing them which specific posts and pages to crawl and index.

A robots.txt file is built from User-agent and Disallow directives:

User-agent: the crawler (robot) the following rules apply to.
Disallow: the specific URL path you want to block from being crawled by the search engine.
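As a quick sanity check, you can test how crawlers interpret these two directives with Python's standard urllib.robotparser module. This is just an illustrative sketch; the hostname yoursitename.com and the example post URL are placeholders:

```python
# Test User-agent/Disallow behaviour with Python's standard library.
import urllib.robotparser

# A minimal rule set: block /search for all crawlers, allow everything else.
rules = """\
User-agent: *
Disallow: /search
Allow: /
""".splitlines()

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules)

# Search pages match the Disallow: /search prefix, so they are blocked.
print(rp.can_fetch("*", "http://yoursitename.com/search?q=seo"))           # False
# Regular post URLs fall through to Allow: /, so they remain crawlable.
print(rp.can_fetch("*", "http://yoursitename.com/2024/01/my-post.html"))   # True
```

Running this before publishing a rule change is a cheap way to confirm you are not accidentally blocking your posts.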

ADD ROBOTS.TXT TO BLOGGER

Follow this step-by-step procedure to add robots.txt to Blogger:

  • Log in to Blogger with your credentials.
  • Click the specific blog you want to edit.
  • Go to Settings > Search Preferences > Crawlers and indexing > Custom robots.txt > Edit > Yes.
  • Paste the following custom robots.txt into the editor and save.
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: http://yoursitename.com/feeds/posts/default?orderby=UPDATED


* Replace http://yoursitename.com in the code above with your own website address.
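Before saving, you can confirm that the pasted file behaves as intended, again using Python's standard urllib.robotparser (yoursitename.com is the same placeholder as above, and /search/label/SEO stands in for any Blogger search/label page):

```python
# Verify the full Blogger robots.txt: the AdSense crawler (Mediapartners-Google)
# gets unrestricted access, other crawlers are blocked from /search pages,
# and the sitemap URL is picked up.
import urllib.robotparser

robots_txt = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: http://yoursitename.com/feeds/posts/default?orderby=UPDATED
""".splitlines()

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt)

# The empty Disallow means Mediapartners-Google may fetch even search pages.
print(rp.can_fetch("Mediapartners-Google", "http://yoursitename.com/search/label/SEO"))  # True
# Every other crawler is kept out of /search.
print(rp.can_fetch("*", "http://yoursitename.com/search/label/SEO"))                     # False
# site_maps() (Python 3.8+) returns the declared sitemap URLs.
print(rp.site_maps())
```

The `Mediapartners-Google` entry exists so that AdSense can still analyse blocked pages for ad targeting; the `*` block applies to all other crawlers.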

To see it in action, check your blog's custom robots.txt file by entering the following URL in your browser:

www.yourwebsitename.com/robots.txt

In our case (for a WordPress website) this is www.seotipsandtricks.net/robots.txt.

After Google's Penguin and Hummingbird updates, SEO changed dramatically, especially around backlinks and link quality. If you don't trust an external link, add a nofollow attribute to it so you are not penalized by Google.

Content should be user-friendly: write for users, not just for SEO. Google always prefers quality content.

About the author

Syed Moin Ali

Hello, my name is Syed Moin Ali and I am the founder of PC Learnings. On this blog I write about blogging, SEO, internet tricks, social networking sites, and making money online.
