Need only tens of pages indexed out of hundreds: is robots.txt okay to proceed with?

8 replies • SEO
Hi all,

We have two subdomains with hundreds of pages, of which only 50 important pages need to be indexed. Unfortunately, the CMS behind these subdomains is very old and does not support deploying a "noindex" tag at page level. So we are planning to block the entire sites in robots.txt and allow only the 50 pages we need. But we are not sure this is the right approach, since Google has been suggesting relying mostly on "noindex" rather than robots.txt. Please suggest whether we can proceed with the robots.txt file.
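For illustration, the plan described above would look roughly like this in robots.txt (the paths here are made up; Google evaluates rules by the most specific match, and the Allow directive is honored by Google but not by every crawler):

```
# Hypothetical robots.txt for one subdomain: block everything,
# then re-allow the pages that should stay crawlable.
# Note: this controls crawling, not indexing -- blocked URLs
# can still appear in results if other sites link to them.
User-agent: *
Disallow: /
Allow: /important-page-1.html
Allow: /important-page-2.html
Allow: /products/key-landing-page/
```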

Thanks
  • Profile picture of the author dave_hermansen
    Is there some reason you don't want the other pages indexed? It's not like they are going to hurt your SEO if they ever get indexed and if they are that worthless, they probably won't get indexed anyway.

    And, of course, you could always switch to another CMS.
    Signature
    StoreCoach.com- Learn How to Dropship the Right Way - Buy & Sell Websites - Partner with Coach
    My PROVEN ecommerce process, as seen on: Fox News, the NY Times & Flippa
  • Profile picture of the author AlexKomkov
Unfortunately, it's not going to work. All the other pages are going to show up as "Indexed, though blocked by robots.txt" in your Search Console. From what I've read, Google assumes you made a mistake in your robots.txt or something like that. I'm experimenting with the "Noindex:" directive in robots.txt, but it's not working for me.
  • Profile picture of the author wduarte24
First of all, you may not need multiple pages for the same subject; you can add a canonical tag pointing to the main pages. Know that the more pages you have, the more you spread your link juice, both on and off page.

Maybe it's time to switch your CMS platform. But you can probably add a noindex tag right now; it's just one line of code. Check this: https://support.google.com/webmaster...er/93710?hl=en
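For reference, the two tags mentioned above are each a single line inside the page's `<head>`; the URL here is a placeholder:

```html
<!-- Hypothetical <head> snippet: a canonical tag pointing duplicate
     pages at the main page, and a robots noindex tag for pages that
     should be kept out of the index. -->
<link rel="canonical" href="https://example.com/main-page/">
<meta name="robots" content="noindex">
```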
  • Profile picture of the author stephenchong
Is it possible for you to shift to a new CMS? Noindex is the approach the search engines recommend. Blocking some of your pages in the robots file is not an issue, and Google will not crawl those pages, but they remain on your backend. So I suggest you upgrade your CMS or move to another platform.
  • Profile picture of the author steverobert
Maybe there is some reason you don't want the other pages indexed. But if you don't want them indexed and they ever do get indexed, it can cause SEO issues. So upgrading the CMS to one that supports noindex can be helpful.
The best option would be to move to a CMS that gives you proper control over this. Blocking that many pages via robots.txt isn't a good idea, but it can be done.
  • Profile picture of the author ezrankings
A noindex tag would be the right way to keep your pages out of the index, but since your CMS doesn't support the noindex tag, it's better to move to another CMS.
  • Profile picture of the author kuchenchef
Not a nice solution, but according to this article from 2015: https://searchengineland.com/tested-...learned-220157 you could create that tag via JavaScript.
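A minimal sketch of that JavaScript approach, assuming the old CMS at least lets you add a site-wide script (the helper name below is made up for illustration). Since Google only sees the tag after it renders the page, this is less reliable than a server-side tag or an X-Robots-Tag header:

```javascript
// Hypothetical helper: appends <meta name="robots" content="noindex">
// to the document head, so that Google (which renders JavaScript)
// may drop the page from its index on a later crawl.
function addNoindexMeta(doc) {
  var meta = doc.createElement('meta');
  meta.setAttribute('name', 'robots');
  meta.setAttribute('content', 'noindex');
  doc.head.appendChild(meta);
  return meta;
}

// In the browser, run it as early as possible on each page
// you want deindexed:
// addNoindexMeta(document);
```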