Recently, my site went down for a day or two because of a network DDoS attack. Google sent me notices that Googlebot could not access my site and that I should fix it.
So tonight I regenerated the sitemap.xml files for all the subdomains. My questions:
First, I use a free service to generate the sitemap.xml files, and several of my subdomains have more pages than its 500-URL upper limit. Does this affect the indexing?
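For reference, my understanding is that the sitemap protocol lets you split a large site across several files and tie them together with a sitemap index, roughly like the sketch below; the subdomain and file names here are just placeholders:

    <?xml version="1.0" encoding="UTF-8"?>
    <!-- Hypothetical sitemap index pointing at two split sitemap files -->
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <sitemap>
        <loc>http://sub.example.com/sitemap-1.xml</loc>
      </sitemap>
      <sitemap>
        <loc>http://sub.example.com/sitemap-2.xml</loc>
      </sitemap>
    </sitemapindex>

I'm not sure whether that is the right way around the 500-URL cap, or whether the cap matters at all for indexing.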
And second, should I simply use the robots.txt file and allow Google to crawl and index the site without the sitemap.xml files?
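To make that second question concrete, this is the kind of bare-bones robots.txt I have in mind; the URL in the Sitemap line is a placeholder, and as I understand it that line is optional:

    # Hypothetical robots.txt: the empty Disallow allows all crawlers
    # everywhere; the optional Sitemap directive points them at a sitemap
    User-agent: *
    Disallow:
    Sitemap: http://sub.example.com/sitemap.xml

Would dropping the Sitemap line and relying on crawling alone hurt how quickly or completely Google indexes the subdomains?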