Need only tens of pages indexed out of hundreds: is robots.txt okay to proceed with?
We have two subdomains with hundreds of pages each, but only about 50 important pages need to be indexed. Unfortunately, the CMS behind these subdomains is very old and does not support adding a "noindex" tag at the page level. So we are planning to block both sites entirely via robots.txt and allow only the 50 pages we need. We are not sure this is the right approach, though, since Google has been suggesting relying on "noindex" rather than robots.txt. Please suggest whether we can proceed with the robots.txt file.
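For reference, here is a minimal sketch of the robots.txt we have in mind for one of the subdomains (the Allow paths are hypothetical placeholders, not our real URLs):

```
# Sketch for one subdomain: disallow everything, then allow
# only the specific pages that should stay crawlable.
# Google follows the most specific (longest) matching rule,
# so these Allow lines override the blanket Disallow.
User-agent: *
Allow: /products/key-page-1.html
Allow: /landing/key-page-2.html
Disallow: /
```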
Thanks