Best way to get Google to recrawl a URL that previously returned a 410
I recently had a major bug on my classifieds site which caused millions of URLs to be generated. These URLs were 'incorrect', often pairing the wrong category directory with an unrelated subcategory, for example. The site also uses faceted navigation, and due to the vast number of pages on the site I set a limit so that Googlebot can only crawl a maximum of 3 facets at any one time; anything past this should not be crawlable.
Unfortunately there were some complications during development, and a number of 'correct' URLs were assigned a 410 header response. This issue was fixed last week via a combination of 200 and 301 responses, but I am not seeing those 'Not Found' errors dropping in WMT. My worry is that Google has visited the URLs, seen the 410, and so will not visit those URLs again. Obviously this is not good!
Due to the high volume, it is not feasible to use the Fetch as Google function for all of them.
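Before prompting a recrawl, it may be worth verifying at scale that the formerly-410 URLs really do return 200 or 301 now. Here is a minimal sketch of such a bulk check; the `fetch_status` callable is a hypothetical parameter (in practice it would issue an HTTP HEAD request, e.g. via `urllib`), injected so the classification logic can be exercised without network access:

```python
# Hypothetical sketch: classify a list of previously-410 URLs by their
# current HTTP status, so any still-broken ones can be found in bulk.
from concurrent.futures import ThreadPoolExecutor

FIXED_STATUSES = {200, 301}  # responses the fix was supposed to produce

def classify(urls, fetch_status, workers=10):
    """Return (fixed, still_gone) URL lists based on each status code.

    fetch_status: callable taking a URL and returning its HTTP status
    code as an int (hypothetical; swap in a real HEAD-request helper).
    """
    fixed, still_gone = [], []
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # pool.map preserves input order, so zip pairs URLs correctly.
        for url, status in zip(urls, pool.map(fetch_status, urls)):
            (fixed if status in FIXED_STATUSES else still_gone).append(url)
    return fixed, still_gone
```

The `fixed` list could then feed an XML sitemap to nudge Google toward those URLs, while `still_gone` flags anything the fix missed.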
Any smart ideas or advice greatly appreciated. Thanks.