How do I manage a website divided into 3 subdomains with a lot of bullshit pages?

8 replies
  • SEO
Hi there,

I have a big issue. I started working at a new company and the on-page SEO is a mess. They have 3 subdomains, and using the site: operator on each one (site:subdomain.com), I found a total of around 66K indexed pages, and a lot of them are crap.

The IT team will extract all those 66K pages indexed by Google, which is cool. But the question is: how can I clean up all this mess in an optimal way? Please help.
#bullshit #divided #lot #manage #pages #subdomains #website
  • Profile picture of the author expmrb
    Search engines consider subdomains as separate domains, so you have to do SEO for each one.
    Signature
    SEO Motionz Forum & Blog- Digital Marketing Forum & Blog,
    Forum Management & Promotion, SEO Tips, Money Making tips etc.
    • Profile picture of the author CarpeNoctem
      That isn't always true. It's a bit of a myth that grew out of sites like Wordpress.com. Google itself uses an endless number of subdomains, like maps.google.com, and they borrow authority from the main domain.
  • Profile picture of the author CarpeNoctem
    It's impossible to say without knowing what you're looking at. You're asking us to make value judgments on 66,000 pages without knowing what they are or how important they are to your business.

    Ultimately, the only way to filter through the crap is to do a manual review of the content and 301 redirect everything you don't need.
    • Profile picture of the author muellerdave
      Thanks for your answer. First of all, I've already judged the 66K pages and I know that a lot of them are crap. I was just wondering if I have to check all of these pages manually... and it seems like yes.

      So the options here are a 301 redirect or a noindex meta tag, right? Is there any other way to make the process faster?
  • Profile picture of the author CarpeNoctem
    Not really, though it depends on what you're looking at. When it comes to clean-ups of this size, I've historically relied on finding patterns, e.g. doing a full crawl of the website with Screaming Frog and sorting by duplicate meta data.

    Depending on what the IA (information architecture) is like, you can then modify .htaccess to target specific subdirectories rather than working on a page-by-page basis.
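
    For example, if whole sections such as /tags/ or /archive/ turn out to be junk, a couple of pattern rules in .htaccess can 301 an entire subdirectory in one go. This is only a rough sketch - the /tags/, /archive/, /blog/ and /posts/ paths are made-up placeholders, so swap in whatever your IA actually looks like:

    Code:
    # Hypothetical paths - adjust the patterns to your own site structure
    # 301 everything under /tags/ to the blog index
    RedirectMatch 301 ^/tags/ /blog/
    # 301 old /archive/... URLs to the equivalent page under /posts/
    RedirectMatch 301 ^/archive/(.*)$ /posts/$1

    RedirectMatch is a mod_alias directive, so it works without touching mod_rewrite; if the site already has a pile of RewriteRule directives, it's usually cleaner to express the same patterns there instead so the two modules don't fight over the same requests.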

    I also typically export all backlinks from Majestic, to make sure I'm not deleting anything that looks like crap but actually has link value.

    I'm always reluctant to just 'redirect everything' on a subdomain, particularly if I'm working with a large CMS, because media files aren't always located in the same place.

    That's about the best I can do for you without visibility of what you're actually doing.

    P.S. In answer to your other question, it may also be worth considering dynamically/conditionally inserting a canonical tag (if you don't want to go through the hassle of deleting/redirecting and you've got a lot of duplicate content). I'm saying that without looking at what the problems are, though, so take it with a pinch of salt.
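
    For reference, the canonical tag itself is just one line in the <head> of the duplicate page pointing at the version you actually want indexed (the URL below is made up):

    Code:
    <!-- goes in the <head> of the duplicate/thin page -->
    <link rel="canonical" href="https://www.example.com/the-page-you-want-indexed/" />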

    P.P.S. noindex/robots/canonical are merely suggestions, not directives. In other words, there's no guarantee Google will honor them (even though they usually do).
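
    For what it's worth, the noindex option is also just a one-liner in the <head> of each page you want dropped (or the equivalent X-Robots-Tag HTTP header if you'd rather not touch the templates):

    Code:
    <!-- in the <head> of each page you want out of Google's index -->
    <meta name="robots" content="noindex, follow">

    Either way, Google has to recrawl a page before it actually falls out of the index, so don't expect that 66K number to drop overnight.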
    • Profile picture of the author muellerdave
      Do you want to check the URL to understand my issue better? If yes, I'll write you a private message.
      • Profile picture of the author CarpeNoctem
        Not really, that sounds like something I get paid for/do most days of my working life.

        Happy to provide guidance in the thread though.