Why Is My Site Not Fully Indexed and SERPs Wildly Inconsistent?

6 replies
  • SEO
I have a 2,400-page site built with all unique articles ranging from 500 to 4,000 words. Everything is as it should be in Google Webmaster Tools, and GWT says all the submitted pages are indexed. However, when I do a "site:domain.com" search on Google, it sometimes says there are 2,600 results, sometimes 1,500, sometimes 3,000, sometimes 200, sometimes 2,400, and everything in between. It almost changes by the hour. So even though the "site:" search might say I have 2,400 pages indexed, when I actually click through the results, only 180 are shown.

So the "site:domain.com" count varies hourly, ranging from 7 to 2,500 indexed pages, but the number of pages actually shown in the SERPs never exceeds 180. Even when Google tells me I have 2,000 or 2,500 pages indexed, the number of clickable results never exceeds 180.

The site has been up for 3 months. Does any SEO guru here have any idea what is going on? I am seriously thinking of ditching the domain, but I fear the same thing might happen on a new domain. Any advice will be greatly appreciated. Thank you.
#fully #inconsistent #indexed #serps #site #wildly
  • SEO Power
    The 'number of results' displayed at the top of the results pages isn't accurate. You have to click to the last page of the results to see the actual number of results for your search.

    180 out of 2,400 pages in 3 months is indicative of an extremely low crawl rate. Proper internal linking and building backlinks to your internal pages can expedite the indexing process for the 2,220 pages that haven't been indexed yet.
    • Profile picture of the author apollocreed
      Originally Posted by SEO Power:

      The 'number of results' displayed at the top of the results pages isn't accurate. You have to click to the last page of the results to see the actual number of results for your search.

      180 out of 2,400 pages in 3 months is indicative of an extremely low crawl rate. Proper internal linking and building backlinks to your internal pages can expedite the indexing process for the 2,220 pages that haven't been indexed yet.
      Thanks for that. I did not know that the last page reveals the "real" number of results; that has never happened to me on a site before. However, I have been clicking through to the last page, and that is how I get 180 results out of 2,400 pages.

      GWT says 2,400 pages have been indexed, so I am lost as to why only 180 get shown. The site is silo-built with "excellent" internal linking. I am puzzled, though, as to why the number of results keeps changing so widely on every page but the last. I have not seen this before on a site, and perhaps it is indicative of something.
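      One sanity check worth running is counting the URLs actually listed in the submitted sitemap, to confirm the 2,400 figure GWT is working from. A minimal Python sketch (the inline sitemap content is a placeholder, not the poster's real sitemap):

      ```python
      import xml.etree.ElementTree as ET

      # Sitemap files use this XML namespace, so element lookups must include it.
      SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

      def count_sitemap_urls(xml_text: str) -> int:
          """Count the <loc> entries in a sitemap document."""
          root = ET.fromstring(xml_text)
          return len(root.findall(f".//{SITEMAP_NS}loc"))

      # Placeholder sitemap with two URLs, for illustration only.
      sample = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
        <url><loc>http://example.com/page-1</loc></url>
        <url><loc>http://example.com/page-2</loc></url>
      </urlset>"""

      print(count_sitemap_urls(sample))  # -> 2
      ```

      In practice you would fetch the live sitemap (e.g. with urllib) and pass its body to the same function, then compare the count against what GWT reports.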
  • Icematikx
    I have to agree with SEO Power. Is there any clear internal linking structure in place? If you're using /page/4 archives going back through thousands of articles, then chances are Google simply isn't crawling them.

    Get a sitemap up, and get it silo-structured if need be. Google won't navigate through 3,000+ articles in a single sitemap. Break the sitemap into categories, etc. (I'm sure a plugin will do this for you).
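    For reference, the sitemaps protocol supports exactly this via a sitemap index file: one small file pointing at several per-category sitemaps, each capped at 50,000 URLs. A minimal sketch (domain and file names are placeholders):

    ```xml
    <?xml version="1.0" encoding="UTF-8"?>
    <!-- Sitemap index: submit this one file in GWT; it points at the
         per-category sitemaps rather than listing every URL itself. -->
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <sitemap>
        <loc>http://example.com/sitemap-category-a.xml</loc>
      </sitemap>
      <sitemap>
        <loc>http://example.com/sitemap-category-b.xml</loc>
      </sitemap>
    </sitemapindex>
    ```

    Each referenced file is then an ordinary `<urlset>` sitemap containing only that category's article URLs.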

    Use a plugin to start an internal linking structure. Plugins can automatically link a phrase such as "Home Insurance" to a relevant article. You need to get PageRank flowing through the site. Writing so many articles on a new domain will never get you anywhere, because the pages the articles sit on are worthless.

    And your main focal point has to be on getting some quality external links. If you have practically no links, then there's no PageRank flowing through the site at all. Adding hundreds of articles a week isn't going to bank you money. You need a mixture of off-site promotion and content creation.

    A 1-page website with 100 external links will rank better than a 200-page website with 2 external links (if all of the links contribute the same amount of "juice").
    • apollocreed
      Originally Posted by Icematikx:

      I have to agree with SEO Power. Is there any clear internal linking structure in place? If you're using /page/4 archives going back through thousands of articles, then chances are Google simply isn't crawling them.

      Get a sitemap up, and get it silo-structured if need be. Google won't navigate through 3,000+ articles in a single sitemap. Break the sitemap into categories, etc. (I'm sure a plugin will do this for you).

      Use a plugin to start an internal linking structure. Plugins can automatically link a phrase such as "Home Insurance" to a relevant article. You need to get PageRank flowing through the site. Writing so many articles on a new domain will never get you anywhere, because the pages the articles sit on are worthless.

      And your main focal point has to be on getting some quality external links. If you have practically no links, then there's no PageRank flowing through the site at all. Adding hundreds of articles a week isn't going to bank you money. You need a mixture of off-site promotion and content creation.

      A 1-page website with 100 external links will rank better than a 200-page website with 2 external links (if all of the links contribute the same amount of "juice").
      Thank you for your great points. My plan was to get the site indexed and then slowly increase quality links to it. The sitemap is already in GWT, which is why it tells me all 2,400 pages are indexed. The domain used was an expired domain (real PR 5 / DA 62) with links from the UN, WHO, UNESCO, etc., and many real .gov and .edu sites. No spammy links on it, and I checked on Wayback, Ahrefs, etc. So all I wanted first was to get it totally indexed and showing in the SERPs, and then increase the links to it.

      I have gradually added a few powerful (20) internal links to get it started, plus a press release.
  • yukon
    There's more than one possibility. One could be that the WMT number isn't counting pages Google has buried in the supplemental results. Or maybe you're unintentionally blocking the pages on the site?
    • apollocreed
      Originally Posted by yukon:

      There's more than one possibility. One could be that the WMT number isn't counting pages Google has buried in the supplemental results. Or maybe you're unintentionally blocking the pages on the site?
      Thanks. I am blocking about 400 pages which are just code and which showed up in the SERPs. I have triple-checked my robots.txt, and the configuration there is correct. Could it be that Google does not care about robots.txt configurations? Maybe I should unblock them?
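      For what it's worth, robots.txt controls crawling, not indexing: a Disallowed URL that Google discovers via links can still appear in the SERPs (typically as a bare URL with no snippet). To actually keep a page out of the index, it generally needs a noindex directive on a page Google is allowed to crawl. A minimal sketch, assuming the blocked code pages live under a hypothetical /code/ path:

      ```
      # robots.txt -- stops crawling of /code/, but does NOT remove
      # already-discovered URLs from the index
      User-agent: *
      Disallow: /code/
      ```

      ```html
      <!-- Alternative: on each code page itself. The page must NOT be
           Disallowed in robots.txt, or Google never sees this tag. -->
      <meta name="robots" content="noindex">
      ```

      So unblocking the pages and serving noindex instead is one plausible way to get those 400 URLs out of the results rather than stuck half-visible.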
