Help indexing all my pages...

7 replies
  • SEO
Hi Guys & Gals

My site seems to have less than 75% of its pages indexed: 211 out of 361.


I've done all the usual stuff:
>Created a Google Webmaster Tools account,
>Verified the account,
>Added sitemaps (XML, TXT, and HTML) and submitted them,
>All pages are linkable.

Any more tips?... The site is in my signature.
#indexing #pages
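If you're building the XML sitemap by hand, a few lines of Python can generate one in the sitemaps.org format that Google Webmaster Tools accepts. This is just a sketch; the URLs below are placeholders, not the OP's real pages:

```python
# Minimal sketch: build a sitemaps.org-style sitemap.xml from a list of page
# URLs (placeholders here, not the OP's real pages). Save the output to a
# file and submit it in Google Webmaster Tools.
from xml.etree import ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    # <urlset xmlns="..."> wrapping one <url><loc>...</loc></url> per page
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for page in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = page
    # XML declaration the sitemap protocol expects at the top of the file
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            + ET.tostring(urlset, encoding="unicode"))
```

For example, `build_sitemap(["http://www.example.com/", "http://www.example.com/page-1"])` returns a ready-to-submit sitemap string for those two pages.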
  • stevejeff
    Hi,
    Target the pages that are not indexed. Rework your sitemaps and navigational links. Participate in social bookmarking, forums, and blogs for the pages that aren't indexed. Add fresh content!
  • obj111
    Build links to individual pages.
  • scene4u
    You could submit an XML sitemap through Google Webmaster Tools; that normally gets all your pages indexed.
    Signature

    Psychic Readings http://www.kooma.co.uk
    Search engine optimisation London http://www.searchsensations.com
    Abs Workout Now http://www.absworkoutnow.co.uk

  • ehicks727
    It takes time. To help ensure your site gets indexed: submit a sitemap, make sure you don't have duplicate content, and link to the pages you want indexed quickly from your home page. Then just wait.

    I'm waiting on Google to index a 600-page site right now. It's taken over a month and I'm only up to 400 or so. Here's what I've observed: Google will grab a large chunk all at once, and then leave two-thirds of your site un-indexed.

    What I do then is link to about 20 un-indexed pages from my home page and monitor them. On this particular site, Google is indexing about 15 new pages a day (it's a brand-new domain). As soon as those pages get indexed, I rotate the home page link list and put new un-indexed pages in it. You can use several pages for this trick: once a page is indexed, check your logs to see how often Google visits it, and set up the frequently-crawled ones as your strategic "link to deep content" pages. The trick is to make sure Google visits these pages daily and re-indexes them every time Googlebot comes by. You just have to keep a diligent eye on your logs.

    The strategy to getting large sites indexed is INTERNAL LINKING.
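    The log-watching part of this can be scripted. Here's a rough Python sketch that counts Googlebot hits per page, assuming a standard combined-format access log (the parsing and the sample paths are illustrative, not from this poster's actual setup):

```python
# Rough sketch: count Googlebot requests per page path in a combined-format
# access log, so you can see which pages the bot revisits daily. The log
# format assumed here is the common Apache/Nginx "combined" layout.
from collections import Counter

def googlebot_hits(log_lines):
    hits = Counter()
    for line in log_lines:
        if "Googlebot" not in line:
            continue
        # combined format quotes the request: ... "GET /path HTTP/1.1" ...
        try:
            request = line.split('"')[1]   # e.g. 'GET /deep-page-7 HTTP/1.1'
            path = request.split()[1]
        except IndexError:
            continue  # malformed line; skip it
        hits[path] += 1
    return hits
```

Feed it `open("access.log")` and the result maps each path to how many times Googlebot fetched it, which is enough to decide when to rotate a page out of the home-page link list.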
  • seach4s
    Now it's going down... 196 out of 361 pages...
    • Brawnydt
      Submit the unindexed pages to social bookmarking sites: give a few to Digg, a few to Mixx, a few to Mister Wong, a few to Folkd, and so forth. This will usually get all those pages indexed in 48 hours or less.

      You can also improve your internal linking structure so the spider will find those pages more easily.

      Woohoo! 100th Post. Just noticed.
  • spanisheye
    When you do a site: search on Google and go to the very last page, you get this message: "In order to show you the most relevant results, we have omitted some entries very similar to the 167 already displayed."

    There's your answer. Basically Google is finding very little difference between some of your pages and therefore doesn't index them as they are not unique enough. You need to make sure all your pages have unique titles, descriptions and body text. Just changing a few words doesn't count.
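    If you want to measure how similar two pages actually are, a crude word-set (Jaccard) comparison is enough to flag the "just changed a few words" pages. This is my own quick check, not anything Google publishes:

```python
# Quick-and-dirty near-duplicate check: word-set Jaccard similarity between
# the body text of two pages. 1.0 means identical word sets, 0.0 means no
# words in common. The threshold you act on is your own judgment call.
def jaccard(text_a, text_b):
    a = set(text_a.lower().split())
    b = set(text_b.lower().split())
    if not (a or b):
        return 0.0  # two empty pages: treat as no similarity signal
    return len(a & b) / len(a | b)
```

Run it across page pairs and rewrite anything that scores near 1.0; pages that only swap a few words will score far higher than genuinely unique ones.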

    Time to get writing, methinks...
