robots.txt update clarification
My question is this: if I disallow a path whose pages are already indexed and showing up in Google SERPs, that does NOT automatically remove the existing results, correct? Those results would eventually drop out once bots stop crawling the pages, but only after the new URLs have been crawled? In other words, will Google see a disallowed path and say, 'go and immediately remove all these URLs from our index'? I want the old pages/URLs to drop off gradually as they get replaced by the new ones. Should I wait a while before disallowing the old path, until the new paths are in place? Or does robots.txt affect only the CRAWL?
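For context, this is roughly the kind of rule I'm planning to add (the paths here are just placeholders, not my real site structure):

User-agent: *
Disallow: /old-section/

The new paths (say /new-section/) would be left out of robots.txt entirely so they stay crawlable.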