5 replies
  • SEO
In 2011 I installed a sitemap.xml file for all of my subdomains and also the main domain.

Recently, my site went down for a day or two because of a network DDoS attack. Google sent me notices that GoogleBot could not access my site and I should fix it.

So tonight I reinstalled sitemap.xml for all the subdomains again. My questions:

I use a free service to create the sitemap.xml files, and several of my subdomains have more pages than its 500-page upper limit. Does this affect indexing?

And two, should I simply use the robots.txt file and let Google index the sites without the sitemap.xml files?
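For what it's worth, robots.txt doesn't make a search engine index anything; it only tells crawlers what they may fetch, and it can also advertise your sitemaps via the Sitemap directive. A minimal sketch (the domains below are placeholders, substitute your own):

```
# robots.txt — allow all crawlers and point them at the sitemaps
# (URLs are illustrative; replace with your real domain and subdomains)
User-agent: *
Disallow:

Sitemap: https://example.com/sitemap.xml
Sitemap: https://sub.example.com/sitemap.xml
```

An empty Disallow line means nothing is blocked; the Sitemap lines are optional but help crawlers find each subdomain's map.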

#issues #sitemap #sitemapxml
  • Mr Lim
    Sitemap.xml exists to make indexing easier, so question 2 is unnecessary.

    Why would a free sitemap.xml service have an index limit?

    What "free service" are you using?
  • promo87
    Yes, it may affect indexing. You used a free service that builds the sitemap for you, but for a more accurate result you need to adjust it yourself. And yes, the sitemap exists for the search engine's convenience, so there is no point in using a robots.txt file just to allow indexing.
  • anoopsparx
    You should submit the sitemap file in Google Webmaster Tools and try to cover all the important pages that need to be indexed in Google.
  • Mr Lim
    Since you're using WordPress, just search for a sitemap.xml plugin; they are free and have no index limit.
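If the site isn't on WordPress, a short script can also generate a sitemap with no artificial page limit. A minimal sketch in Python using the standard library (the URL list here is purely illustrative):

```python
# Minimal sitemap.xml generator — a sketch, not a full implementation.
# The URLs passed in are placeholders; feed it your real page list.
from xml.etree import ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Return sitemap XML as a string for the given list of page URLs."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(urlset, encoding="unicode")

if __name__ == "__main__":
    pages = ["https://example.com/", "https://example.com/about"]
    print(build_sitemap(pages))
```

Note the sitemap protocol itself allows up to 50,000 URLs per file; beyond that you split into multiple files and list them in a sitemap index.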
