But I have a bunch of new blogs created about a month ago, which Google is stubbornly refusing to index. Doing a "site:domain.com" search for any of them brings up zero results.
However, the weird thing is that the Googlebot is crawling all over them, and has been for some time. Some are getting crawled every few minutes. And some are even getting search traffic from Google.
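For what it's worth, one way to double-check that those hits really are from Googlebot (and not some other bot spoofing the user agent) is to pull the "Googlebot" entries out of the server access log and note the client IPs. Here's a minimal sketch in Python, assuming the server writes the common Apache/Nginx "combined" log format (the sample lines and field order are assumptions; adjust the regex for a different log layout):

```python
import re

# Hypothetical sample lines in the "combined" log format (assumption: the
# blogs' servers log in this format; field order may differ on other setups).
SAMPLE_LOG = """\
66.249.66.1 - - [01/May/2024:10:00:00 +0000] "GET / HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
203.0.113.9 - - [01/May/2024:10:00:05 +0000] "GET / HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
198.51.100.7 - - [01/May/2024:10:00:09 +0000] "GET /feed HTTP/1.1" 200 128 "-" "SomeOtherBot/1.0"
"""

def googlebot_ips(log_text):
    """Return the set of client IPs whose user agent claims to be Googlebot."""
    ips = set()
    for line in log_text.splitlines():
        # First field is the client IP; the last quoted field is the user agent.
        m = re.match(r'(\S+) .* "([^"]*)"$', line)
        if m and "Googlebot" in m.group(2):
            ips.add(m.group(1))
    return ips

print(sorted(googlebot_ips(SAMPLE_LOG)))

# To confirm an IP is genuinely Google's, reverse-resolve it
# (socket.gethostbyaddr) and check the hostname ends in .googlebot.com
# or .google.com, then forward-resolve that hostname and confirm it
# maps back to the same IP.
```

If the IPs reverse-resolve to Google hostnames, then the crawling is genuine and the mystery is purely on the indexing side.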
Yet the site: command still says zip, zilch, nothing...
What the heck does that mean?
I did wonder if they had been indexed and then quickly de-indexed for some accidental violation of Google's ToS. But as they are all good sites with unique content, I can't see why that should happen. And in any case, surely the Googlebot would not waste resources constantly crawling a de-indexed site?
Any ideas, anyone?