Removing robots.txt 'disallow' for a Large Number of Pages

0 replies • SEO
I have a large e-commerce website (StyleTread.com.au) and most subdirectories and pages are disallowed in robots.txt. This was done because the site architecture had to be changed, and we didn't want Google crawling and indexing those pages while duplicate content issues were present.
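For illustration only, the current file looks roughly like this (the paths below are hypothetical placeholders, not the actual StyleTread directories):

    User-agent: *
    # Hypothetical examples of the kind of blanket blocks currently in place
    Disallow: /category/
    Disallow: /brands/
    Disallow: /product/
    Disallow: /search/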

However, this is being resolved now, so I was wondering: are there any issues with removing all of these disallow rules from robots.txt at once? Will Google see this as too many pages being pushed out at once? Or does Google already acknowledge the existence of the pages and know that they are a normal part of the site?
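In practice, "removing the pages from robots.txt" would mean trimming the file back to something like the sketch below (again, purely illustrative), leaving nothing blocked so Googlebot can crawl those sections again:

    User-agent: *
    # Former Disallow lines deleted; an empty Disallow value blocks nothing
    Disallow: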