Duplicate content is a major SEO problem for WordPress users. It can be a significant factor in losing rank with Google, largely because of the category and tag archive pages that are built into WordPress. This simple robots.txt file will help eliminate those errors.
You can fix this issue with robots.txt. What is robots.txt? It is a plain text file that tells search engine bots which parts of your blog or website they may crawl. You can check yours by visiting http://www.yoursite.com/robots.txt. If you don't have one, create it and upload it to your website's root directory.
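If you want to verify what a given set of rules actually blocks before uploading the file, Python's standard library can parse robots.txt rules for you. The rules below are illustrative sample rules, not the exact file from this tip:

```python
from urllib.robotparser import RobotFileParser

# Sample robots.txt rules blocking the archive pages that
# commonly cause duplicate-content issues in WordPress.
rules = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /category/
Disallow: /tag/
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# Archive pages are blocked for all crawlers...
print(rp.can_fetch("*", "https://example.com/category/news/"))
# ...but normal posts remain crawlable.
print(rp.can_fetch("*", "https://example.com/2024/05/hello-world/"))
```

This is a quick sanity check only; Google Search Console's robots.txt tester gives the authoritative answer for Googlebot.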
Thanks to Mike at WebTricksBlog for this awesome Robots.txt tip.
Just copy and paste the following text into the robots.txt file in the root of your WordPress site. Don’t forget to change the last line of the code to your domain!
# disallow all files in these directories
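The comment above is only the opening line of such a file. As an illustration (a common WordPress robots.txt pattern, not necessarily the exact file from WebTricksBlog), the rest of the file typically lists the directories to block and ends with the sitemap line you would change to your own domain:

```
User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
Disallow: /wp-content/themes/
Disallow: /category/
Disallow: /tag/

Sitemap: http://www.yoursite.com/sitemap.xml
```

The `Sitemap:` line is the one to update with your own domain, as noted above.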