WordPress Duplicate Content - Solving an SEO Problem with robots.txt

Posted on March 10, 2013

Duplicate content is a major SEO problem faced by WordPress users. It can be a significant factor in your loss of rank with Google, largely because of the category and tag archive pages that are built into WordPress, which repeat your post content across multiple URLs. A simple robots.txt file will help eliminate those errors.

 

You can fix this issue with robots.txt. What is robots.txt? It is a plain text file used to control how search engine bots crawl your blog or website. You can check yours by visiting http://www.yoursite.com/robots.txt. If you don't have one, create it and upload it to your website's root directory.

 

Thanks to Mike at WebTricksBlog for this awesome Robots.txt tip.

Just copy and paste the following text into the robots.txt file in the root of your WordPress site. Don’t forget to change the last line of the code to your domain!
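A robots.txt along these lines, assuming a standard WordPress install, might look like the following. The blocked paths and the sitemap URL are illustrative placeholders, not necessarily the exact rules from the original tip; adapt them to your own site.

```
# Block crawling of WordPress admin and system directories
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/plugins/

# Block the archive pages that cause duplicate content
Disallow: /category/
Disallow: /tag/
Disallow: /author/

# Point crawlers at your sitemap -- change this line to your own domain
Sitemap: http://www.yoursite.com/sitemap.xml
```

Note that the `Disallow` rules apply to all bots because of the `User-agent: *` line; to give a specific crawler such as Bingbot different rules, add a separate `User-agent: bingbot` section above it.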

 

