I have what is probably going to be a really stupid question about the robots.txt file. My basic understanding is that if there is no robots.txt file, then the spider assumes all pages and files should be crawled. I have never had to deal with this before; the programmers I have worked with always set it up correctly. But on a couple of my new affiliate sites, Google Webmaster Tools is reporting the error status "missing robots.txt", and I'm totally confused.

I will appreciate any advice or information. Thanks!
#question #robots
  • paulgl
    Google is probably trying to fetch robots.txt, and since it does
    not exist, the request returns an error, most likely a 404. Virtually
    any time a crawler like Google visits your site, it will look for a
    robots.txt file and try to access it, no matter what. It assumes the
    file is there, and when it isn't found, you get the error.

    The advice is to create one. The directions can be found by
    googling it. It's actually pretty simple and you can cut and paste
    the code to suit your needs.
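
    For example, here is a minimal robots.txt that allows every crawler
    to access everything (place it at the root of your domain, so it is
    served at /robots.txt):

    ```
    User-agent: *
    Disallow:
    ```

    An empty Disallow line means "nothing is disallowed". This alone is
    enough to make the "missing robots.txt" error go away.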

    That said, if you don't have one, you don't strictly need one; it just
    lets you control how you want your site crawled. And yes, you are right:
    if the file doesn't exist, crawlers will attempt to crawl everything
    that is linked.
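
    If you want to see how a crawler interprets robots.txt rules, here is a
    small sketch using Python's standard-library urllib.robotparser. The
    rules and URLs below are hypothetical examples, not anything from this
    thread:

    ```python
    # Sketch: how a well-behaved crawler interprets robots.txt rules,
    # using Python's standard-library urllib.robotparser.
    from urllib.robotparser import RobotFileParser

    # Hypothetical robots.txt content: block only the /private/ section.
    rules = """\
    User-agent: *
    Disallow: /private/
    """

    parser = RobotFileParser()
    parser.parse(rules.splitlines())

    # Pages outside /private/ are crawlable; pages under it are not.
    print(parser.can_fetch("*", "https://example.com/index.html"))      # True
    print(parser.can_fetch("*", "https://example.com/private/a.html"))  # False
    ```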

    Is it needed? Hmmmm.

    It would sure make that error message go away, right?

    Paul

