Removing robots.txt 'disallow' for a Large Number of Pages
However, this is being resolved now, and I was wondering: are there any issues with removing all of these pages from robots.txt at once? Will Google see this as too many pages being pumped out at once? Or does Google already acknowledge the existence of the pages and know that they are a normal part of the site?
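For context, the change in question would look something like the following. The paths here are hypothetical placeholders, not from the original post; the point is simply that lifting the block means deleting (or narrowing) the relevant `Disallow` rules rather than adding anything:

```
# Before: a large section of the site is blocked from crawling
User-agent: *
Disallow: /products/
Disallow: /archive/

# After: the Disallow lines are removed, so those URLs become crawlable again
User-agent: *
Disallow:
```

Note that removing a `Disallow` rule only permits crawling; it does not force immediate indexing. Googlebot re-fetches robots.txt periodically (typically within about a day) and then recrawls the newly allowed URLs at its own pace, so the pages tend to be picked up gradually rather than all at once.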