Robots, Wordpress and Duplicate Content

Hi gang,

WordPress and all its subdirectories are new to me. Is there a standard robots.txt people use with WordPress so that only the pages and posts are indexed?

So far I've got:

User-agent: *
Disallow: /wp-
Disallow: /feed/
Disallow: /trackback/
Disallow: /cgi-bin/
But I think I should also have:

Disallow: /tag/
Disallow: /category/
Disallow: /author/
Disallow: /admin/

And something for archives... /archives/? Maybe?
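For what it's worth, here's how I've been sanity-checking candidate rules with Python's standard-library robots.txt parser. The combined rule set below is just my draft from above, not anything official; the example URLs are placeholders:

```python
# Sketch: check which paths a candidate robots.txt blocks, using only the
# standard library. The Disallow lines combine the rules I already have
# with the suggested additions (these are assumptions, not a standard).
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /wp-
Disallow: /feed/
Disallow: /trackback/
Disallow: /cgi-bin/
Disallow: /tag/
Disallow: /category/
Disallow: /author/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Posts stay crawlable; admin pages and taxonomy archives do not,
# because Disallow entries match by path prefix.
print(rp.can_fetch("*", "https://example.com/my-post/"))        # True
print(rp.can_fetch("*", "https://example.com/wp-admin/"))       # False
print(rp.can_fetch("*", "https://example.com/tag/wordpress/"))  # False
```

Note that `Disallow: /wp-` catches /wp-admin/, /wp-includes/, and /wp-content/ in one line, since matching is by prefix.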

Any suggestions?

Thanks,
Stephen
#search engine optimization #content #duplicate #robots #wordpress
  • Let everything be indexed, even your contact page, privacy policy, etc.

    Having too many restrictions can hurt your site's search performance.
    • I'd restrain search engine bots from crawling pages like the terms of service, privacy policy, and about us, since crawling them dilutes the link juice passed between your important web pages. Just a suggestion...

      If you're new to WordPress, just use the All in One SEO Pack plugin and your SEO will be taken care of, as will your duplicate content issue.
  • Why would you want to?

    Lots of those pages are important.
    • I have no reason to... I was just searching the web and found some suggestions, so I came here to verify!
