Google's Algorithm Exposed - How Google Decides What Position To Rank Your Site!

Hey. It may not surprise you that Google has hundreds of variables
when it comes to ranking, i.e. the things it looks at to determine
where you should rank. Obviously only Google knows exactly what it
takes to rank. Below are the good and bad things I believe Google
looks at to determine where to place your site in the SERPs, based
on my experience as an SEO. I'm sure you can work out which ones
Google wouldn't like. Please feel free to chime in if you can think
of any more.
So these are the things I can think of that Google 'probably' looks
at when deciding where you should rank. I expect this list is pretty
close to the truth, although I'm sure there are hundreds more
factors. Again, feel free to chime in if you can think of any more!

Will
  • Probably one of the most important keys...
  • Wow. Thanks Will. You really know your stuff when it comes to SEO. I would never have thought of all of those, and I consider myself pretty good at SEO.

    I hope people realise what a great post this is, because I've seen a lot of people asking what they need to be doing to get results with SEO, and this list pretty much shows them everything they need to know.
    • Yes, I agree that the bounce rate of the page is very important! A useful and nicely organized post. Thanks a lot!!
  • Just a few then!! That's a lot of things to take into consideration. It also seems that results can vary a bit throughout the day, so I wonder if there is any "randomness" or "fuzziness" built into it somewhere.
  • A good list with some key points, but I don't think the following will affect the SERPs or would even be something Google would check:

    With so much virtual hosting these days, I doubt that this could affect your site negatively, unless it was directly related to any bad sites on the same IP.

    I don't think Google can possibly be as subjective as to decide whether a website is "good" or not.

    Static pages that sit there not being updated can rank just as well as pages with constant updating.

    I don't think this kind of content should affect your rankings.

    Google has stated that a site doesn't have to be standards compliant, and I've seen Matt Cutts say that they don't even try to verify the code on a page.

    That said, Hobo Web did a great little validation test here and it seems that whilst there were some flaws in their method, Google "preferred" their valid page.

    I don't think Google looks at this kind of thing, not when it comes to its search results anyhow - there are plenty of sites that people would only need to be on for a few seconds and they still rank fine.

    I really doubt Google would use this to affect the SERPs... although it would make sense for sites with a lower bounce rate to rank higher, since presumably the content is better... but I've had sites with a bounce rate of 50% and higher rank well... does anyone know anything about this?

    This is checked, but having duplicate content won't get you penalised; that's a popular misconception. All Google does is index one instance of the duplicate content.
    • @edgray thanks for your reply.



      Actually I'm pretty sure Google does look at the other sites
      on the same IP address, or if you don't have WHOIS protection
      they look at your details and see the other sites you own. This
      is one of the reasons Google became a registrar. By looking
      at the other sites you own they can get an idea of what it is
      you're up to. It may not be a huge factor, but it's one I'm sure
      they look at.



      "Good" probably wasn't the best word for me to have used but essentially
      they can determine whether a site or page linking to you is "good" or
      not. I probably covered "good" under PR of page and domain authority
      but Google would give much more weight to a link from wikipedia than
      they would a link from some pligg or scuttle site. Therefore they can
      decide whether a link is "good" or not.



      That's right, I never said they didn't. In fact I'd say that
      editing content can actually be a bad thing to do, because when
      Google has given you a place and you edit your content, they have
      to rethink your ranking, so editing content isn't necessarily
      good. However, updating your site with fresh content can be a
      good thing for some keywords.

      Some keywords merit fresh content and some don't. Not a lot of
      people know this, but Google treats every keyword differently,
      i.e. they look for different things to show their searchers based
      on what the searcher is looking for, so updating can be a good
      thing and in some cases can ruin a ranking.



      Actually it will affect your rankings. Again, Google treats every
      keyword differently, so the type of content on your site can
      have a dramatic effect on your ranking. Google's very clever and
      is only getting better at working out what the searcher wants and
      then giving it to them. So if you can give Google the type of
      content it wants, it can affect your rankings.



      If the code is full of errors and the Google bots can't read it,
      then how are you going to get a good ranking? Reading and
      verifying are two totally different things. They can't rank what
      they can't read. Enough said.



      Again, this may not be a huge factor, but it is something they
      definitely look at. I believe Google can actually estimate, from
      the amount of content on the page, what the average user's time
      on the page should be. So if your time on page is something like
      3 seconds and your bounce rate is 90%, Google's going to realize
      something's not right (a rough sketch of that back-of-the-envelope
      check follows this reply). Again, it may not be a huge factor,
      but it's definitely a factor nonetheless.



      This is mostly true; however, Google will often index all
      instances of the duplicate content but will only show one instance
      of that content on the first page. However, if the duplicate
      content is on the same site, it will often only index the page
      that has the most backlinks and provides the best user experience.

      Will
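      A minimal sketch of that dwell-time heuristic: estimate how long a visitor
      should spend on a page from its word count, then flag pages whose measured
      time on page and bounce rate look implausible for that much content. The
      reading speed and thresholds below are my own illustrative assumptions,
      not anything Google has published.

      Code:
      # Toy dwell-time check (illustrative numbers only, not Google's method).
      AVG_READING_SPEED_WPM = 200  # assumed average web reading speed

      def expected_dwell_seconds(word_count: int) -> float:
          """Rough expected time on page, in seconds, given the amount of content."""
          return word_count / AVG_READING_SPEED_WPM * 60

      def looks_suspicious(word_count: int, avg_dwell_seconds: float, bounce_rate: float) -> bool:
          """Flag a page where visitors leave far faster than the content suggests,
          especially when almost everyone bounces."""
          expected = expected_dwell_seconds(word_count)
          return avg_dwell_seconds < 0.1 * expected and bounce_rate > 0.8

      # A 1,500-word page read for 3 seconds with a 90% bounce rate:
      print(expected_dwell_seconds(1500))       # 450.0 seconds expected
      print(looks_suspicious(1500, 3.0, 0.90))  # True -> "something's not right"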
  • This is a pretty solid list. I am so glad you included that factor, because this is, imho, one of the most important aspects of a backlink. It's why I believe blog commenting to be one of the most beneficial SEO tasks in my workday.

    I am also glad to see you added "Bounce Rate" & "Avg. Time Spent On Site", as I believe these are important factors. Google Analytics tracks them, so you know that Google takes these into consideration.

    The only factor I might add to the list would be "Number of Pages Viewed Per Visit." This goes hand in hand with Bounce Rate & Avg. Time Spent On Site, but it's another indicator that the entire site is more authoritative rather than just one particular page (the sketch after this exchange shows the simple arithmetic behind all three metrics).
    • Totally unimportant, because they can't be measured accurately or consistently, even if the site has Google Analytics on it.

      What really matters are the quantity and quality of incoming links and how well the anchor text of those links and the content on the site relate to the search term. The rest matters very little.
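    For anyone unsure what these three metrics actually mean, here is a minimal
    sketch of the arithmetic behind them, computed from made-up session records;
    the field names and numbers are my own illustration, not Google Analytics'
    real data model.

    Code:
    # Toy engagement metrics from hypothetical session records.
    from dataclasses import dataclass

    @dataclass
    class Session:
        pages_viewed: int
        seconds_on_site: float

    sessions = [
        Session(1, 4.0),    # a bounce: one page, then gone
        Session(5, 310.0),
        Session(3, 120.0),
    ]

    bounce_rate = sum(s.pages_viewed == 1 for s in sessions) / len(sessions)
    avg_time_on_site = sum(s.seconds_on_site for s in sessions) / len(sessions)
    pages_per_visit = sum(s.pages_viewed for s in sessions) / len(sessions)

    print(f"bounce rate:      {bounce_rate:.0%}")        # 33%
    print(f"avg time on site: {avg_time_on_site:.0f}s")  # 145s
    print(f"pages per visit:  {pages_per_visit:.1f}")    # 3.0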
  • That's a nice list. A thorough compilation of what needs to be done.

    But I don't think it matches the title of the thread. Those tactics have been in the industry for quite a while, so I don't think anything has been exposed.

    It's great that you have listed some overlooked SEO tasks as well.
  • Quote:
    Actually I'm pretty sure Google does look at the other sites
    on the same IP address, or if you don't have WHOIS protection
    they look at your details and see the other sites you own. This
    is one of the reasons Google became a registrar. By looking
    at the other sites you own they can get an idea of what it is
    you're up to. It may not be a huge factor, but it's one I'm sure
    they look at.

    I think this plays a huge factor. Think about it: they find a webmaster who has 50 domains in the same niche that are all linked from the same one or two hosting accounts. I think G could consider this spam (see the sketch just below for how easily that footprint can be spotted).
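    A toy sketch of that footprint check, using nothing more than DNS lookups. The
    domain names are placeholders, and because of shared virtual hosting a matching
    IP proves very little on its own; this only shows how visible the pattern is.

    Code:
    # Group domains by the IP address they resolve to (placeholder domains).
    import socket
    from collections import defaultdict

    domains = ["example.com", "example.net", "example.org"]  # hypothetical list

    by_ip = defaultdict(list)
    for domain in domains:
        try:
            ip = socket.gethostbyname(domain)
        except socket.gaierror:
            continue  # skip domains that don't resolve
        by_ip[ip].append(domain)

    for ip, group in by_ip.items():
        if len(group) > 1:
            print(f"{ip}: {len(group)} domains share this address -> {group}")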
  • I'd say LSI keywords as well. I've had plenty of sites that were optimized with LSI keywords jump up in the rankings (a quick sketch of what LSI actually computes follows this post).
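    For anyone unfamiliar with the term, here is a minimal sketch of what latent
    semantic indexing (LSI) actually computes: TF-IDF vectors reduced with a
    truncated SVD so that documents about related terms end up close together in a
    small "topic" space. The example documents are made up, it requires
    scikit-learn, and whether Google uses anything like this is speculation.

    Code:
    # Tiny LSI demo: TF-IDF + truncated SVD (requires scikit-learn).
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.decomposition import TruncatedSVD

    docs = [
        "car insurance quotes for new drivers",
        "cheap auto insurance and vehicle cover",
        "chocolate cake recipe with dark cocoa",
        "baking a sponge cake at home",
    ]

    tfidf = TfidfVectorizer().fit_transform(docs)       # term-document matrix
    lsi = TruncatedSVD(n_components=2, random_state=0)  # project onto 2 latent topics
    topic_vectors = lsi.fit_transform(tfidf)

    # Documents on the same topic cluster together even with few shared keywords,
    # which is the "related terms" idea behind LSI.
    for doc, vec in zip(docs, topic_vectors):
        print(f"{vec.round(2)}  {doc}")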
  • SEO changes so fast it's hard to keep up with what the rules will be tomorrow!
  • "Relevancy of page's content where your backlink resides" I really don't think this has relevance. If someone here has information about this subject, I would be very interested in reading it :-)
  • Oh, I'll have to sweat to check up on all of these.
