2 replies
  • SEO
So I've just submitted a sitemap to Google but am getting warning messages that read:
URL blocked by robots.txt.
Sitemap contains URLs which are blocked by robots.txt.
This is impossible, because I no longer even have a robots.txt file.
Up until last week I was blocking Googlebot, but I hadn't submitted a sitemap at that point. This is very strange. Does anybody have any ideas what is going on here? Thanks.
#googlebot
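
A quick sanity check (a minimal sketch; "example.com" is a placeholder for the real domain) is to look at what the web server actually returns for /robots.txt right now. If a CDN or a stale cached copy is still serving a Disallow rule, Search Console can keep reporting blocked URLs even though the file looks deleted:

    # check_robots.py -- see what /robots.txt really returns right now.
    # "example.com" is a placeholder; substitute the actual domain.
    import urllib.error
    import urllib.request
    from urllib.robotparser import RobotFileParser

    site = "https://example.com"

    # 1) Fetch the raw file (or confirm it 404s).
    try:
        with urllib.request.urlopen(site + "/robots.txt", timeout=10) as resp:
            body = resp.read().decode("utf-8", errors="replace")
            print("HTTP status:", resp.status)
            print(body if body.strip() else "(empty robots.txt)")
    except urllib.error.HTTPError as e:
        # A 404 here means there really is no robots.txt, so nothing is blocked by it.
        print("HTTP status:", e.code)

    # 2) Ask Python's robots.txt parser whether Googlebot may crawl the homepage.
    parser = RobotFileParser(site + "/robots.txt")
    parser.read()
    print("Googlebot allowed on /:", parser.can_fetch("Googlebot", site + "/"))

If this shows a 404 or an empty file but the warning persists, Google is most likely still working from its cached copy of the old robots.txt, and the sitemap warnings should clear once it re-fetches the file and re-processes the sitemap.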
  • paulgl
    You don't need a robots.txt file, nor a sitemap.

    Delete them both, done and done.

    Why would you ever block Google?

    Google has more than one bot, not just one. They use
    data from the most recent crawl, so they may still be
    working from "old" data.

    On a side note, that's what's great about AdSense:
    you get the AdSense bot to visit your site.

    In fact, people who say you can't get indexed without
    backlinks are not 100% correct. Search results may
    appear that were picked up by the AdSense bot!

    This happens frequently if the regular Googlebots
    have not crawled your site yet, or not recently.

    Paul
    Signature

    If you were disappointed in your results today, lower your standards tomorrow.

    • lutherlars
      Hey, thanks for the reply. Regarding your comment that "Google has more than one bot... they use data from the most recent crawl": would that also apply to a fresh sitemap that was just submitted, though?
