by acs111
4 replies
  • SEO
Hi, I'm just doing a site audit and noticed the robots.txt contains only a single line, the comment # Allow all.

Do you think this is a pretty basic robots file, and should we really be adding more rules? This is for a large eCommerce equipment hire website.
#robotstxt #seo
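Since lines beginning with # are comments, a robots.txt containing only # Allow all is effectively an empty file, and crawlers treat an empty or missing robots.txt as permission to crawl everything. For reference, the conventional explicit equivalent is:

    User-agent: *
    Disallow: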
  • Annie15
    An eCommerce site generates a lot of internal search pages, which are of no use to search engines. Block pages like search result pages and pages with session IDs in the URL, for example:
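    A minimal sketch of what those rules might look like; the /search/ path and the sessionid parameter are placeholders here, since the real URL patterns depend on how the site is built:

        User-agent: *
        Disallow: /search/
        Disallow: /*?sessionid=
        Disallow: /*&sessionid=

    The * wildcard is an extension honoured by major crawlers such as Googlebot and Bingbot rather than part of the original robots.txt standard, so it is worth checking rules with a robots.txt tester before relying on them.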
  • leewatson07
    Can you please elaborate on this topic so I can understand it better?
    • acs111
      Originally Posted by leewatson07

      Can you please elaborate on this topic so I can understand it better?
      The robots.txt file lets you tell search engine crawlers which parts of your website they may and may not crawl.
      Sometimes it's preferable to block pages that contain duplicate content or are dynamically generated, for example:
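      As an illustration, an equipment hire store might block faceted sort and filter URLs, which generate near-duplicate versions of the same category page. The sort parameter below is a hypothetical name, not a rule to copy verbatim:

          User-agent: *
          Disallow: /*?sort=
          Disallow: /*&sort=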
  • Hansons
    You can allow all, which is also the default behaviour even when no robots.txt exists at all...

    Before disallowing anything, though, you need to understand the robots.txt syntax first; a single character can make all the difference, as the sketch below shows.
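    Purely as an illustration of the syntax, compare these two alternative one-rule files:

        User-agent: *
        Disallow:       # blank value: nothing is blocked, the whole site may be crawled

        User-agent: *
        Disallow: /     # a single slash: every URL on the site is blocked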