The NEW number 1 aspect of getting top ranked in search engines.

64 replies
  • SEO
Okay, it's not new at all. It's just getting more and more recognition as time goes by. It's not keyword placement in any form. It's not the amount or even the quality of backlinks to your page. It's....the quality of the content.

You have to look at what Google actually is. They are a service that provides their users with answers to their questions. Google wants more than anything to provide their users with the best quality answers to their questions.

Now, the internet is so large, they can't do this with actual people. They use an algorithm and bots to do the major legwork to find the best quality results. In the past, the factors have been keyword usage, backlinks, etc. That's because the algorithm made it so.

As time progresses though, so does the algorithm. Things like keyword usage and backlinks can be manipulated and taken advantage of. You can never fake the quality of the content.

As Google's algorithm for recognizing quality progresses, so should the quality of what you produce. One of the best ways to get ranked highly in search engines is to provide something better than the already ranked opponents.

Travis
#aspect #engines #number #ranked #search #top
  • Profile picture of the author jasonmorgan
    As Google's algorithm for recognizing quality progresses, so should the quality of what you produce. One of the best ways to get ranked highly in search engines is to provide something better than the already ranked opponents.
    And you can back up this claim?

    Define quality content.

    How does google determine what is quality content?

    Sorry, there is nothing wrong with providing quality content but your theory is based on wishful thinking and hot air.
    Signature

    I'm all about that bass.

    {{ DiscussionBoard.errors[2128119].message }}
    • Profile picture of the author petelta
      Originally Posted by jasonmorgan View Post

      Sorry, there is nothing wrong with providing quality content but your theory is based on wishful thinking and hot air.
      Not at all. I don't just come up with stuff off the top of my head. It comes from results. Quality content is providing the best answer to the questions being searched.

      It's of course only one aspect of getting ranked. You have to backlink, you have to have SEO, but you also have to provide good stuff.

      I get this from the many sites I run and work on rankings for each day. At one time all you had to do was have keywords in the right places. Then after that evolved, you just had to backlink the crap out of your site. Now evolution is moving forward again: you have to provide more quality.

      Don't think Google isn't trying to provide the best for their users. That's what their business is about. You have to give them the quality to rank.

      It's in no way hot air. I have done this for some time now.
      Signature
      TEESPRING Student Rakes In Over $116k In Less Than 3 Months
      Niche Pro Profits - How I raked in OVER $120k in 9 months with authority niche sites...

      {{ DiscussionBoard.errors[2128142].message }}
      • Profile picture of the author Titans
        Originally Posted by petelta View Post

        Okay, it's not new at all. It's just getting more and more recognition as time goes by. It's not keyword placement in any form. It's not the amount or even the quality of backlinks to your page. It's....the quality of the content.
        Wrong

        Every 2 weeks someone on this forum will come up with a "NEW" idea that ends up being something that was already debunked years ago, so here it is again:

        Your theory on "quality" is fundamentally flawed because:

        1. The "Quality" you used here is subjective, a standard that varies from human to human.
        2. Binary computer programs (Google) use absolute 0s and 1s and cannot determine this "quality" without using other objective data and formulas.
        3. At which point, the ranking is based on a bunch of math and numbers and no longer has anything to do with "Quality".
        4. "Quality" is the final answer the human brain gives itself, not a computing factor, so you've got it backwards.


        Originally Posted by petelta View Post

        You can never fake the quality of the content.
        Wrong, why do you think they stopped taking meta keywords into account?


        Originally Posted by petelta View Post

        As Google's algorithm for recognizing quality progresses, so should the quality of what you produce. One of the best ways to get ranked highly in search engines is to provide something better than the already ranked opponents.
        What the hell is "better"?

        Originally Posted by petelta View Post

        It's of course only one aspect of getting ranked. You have to backlink, you have to have SEO, but you also have to provide good stuff.
        What the hell is "good"?

        Originally Posted by petelta View Post

        Now evolution is moving forward, you have to provide more quality.
        What the hell is "quality"?

        Originally Posted by petelta View Post

        You have to give them the quality to rank.

        It's in no way hot air. I have done this for some time now.
        Again, if it is not hot air, then what the hell is "quality"?

        Your theory falls flat on its face in the real world in the following examples:

        1. Humans don't really want "Quality"; the most popular foods in the world are junk foods made with harmful chemicals and zero nutrition.
        2. The Toyota Hilux ("Indestructible") and the Lamborghini Murcielago are both "Quality" cars, but A LOT more people want the Lamborghini. The Lamborghini ranks higher in the human mind not because of "Quality" but because of the experience and social status you get with it.
        3. The most popular people on earth are well known whores and not "Good" (Quality) women.


        Originally Posted by petelta View Post

        The whole point of their algorithm is to deliver quality to the searcher. They are going to continue to find variables and factors that determine "better" quality.
        Your theory falls flat on its face in the business world because:

        1. Google is a business and the whole point of their algorithm is to keep Google in business.
        2. Which means showing what most people want in the top 10 results so they don't flock to competitors like Yahoo or Bing.
        3. When people see what they want, they think the quality is good, when people see what they don't want, they think the quality is bad. Quality is the final human answer, not a computing factor.
        4. So what it all comes down to, just like everything else in the market, is popularity.
        5. Popularity > Quality. That's why voting (backlinks) is so important in Google's algorithm. What's next is determining how real that vote is, then adding the personal bias factors. This is where Google is today.


        Originally Posted by petelta View Post

        The point of the whole post is that quality is a growing factor in how well you are getting ranked. At this point in time you are going to need all aspects, BECAUSE that's what the algorithm calls for. But the algorithm changes.
        No, all you really did is post that you think you can produce good quality.

        Google has already realized there is no absolute way of determining what everyone considers good quality at the same time; that's why they moved on to personalized results.

        Ask the same woman what she sees as a "Quality Man" when she is in her teens/twenties/thirties/forties and you'll get different answers.

        This is what Google is dealing with, and if you already know everything is dynamic, then why waste time focusing on a useless word called "Quality"?

        "Quality" is just another word for "Good".

        All you said in this thread is basically "Good content ranks better", which sounds good in theory to amateurs but is 100% useless in practice.


        Originally Posted by petelta View Post

        I have no idea what they want. Maybe relevant keyword usage, sentence composition, grade level to understand the material. I don't know.
        This should be the title of your thread, "I don't know what Google wants", instead of "The NEW number 1 aspect is x".

        "Quality" by itself is simply irrelevant. It is used when people without inside knowledge looking from the outside. When you see people use "Quality" by it self, that manes that person knows what he/she wants but has absolutely no idea what's going on under the hood.
        {{ DiscussionBoard.errors[2129588].message }}
    • Profile picture of the author solton
      Originally Posted by jasonmorgan View Post

      And you can back up this claim?

      Define quality content.

      How does google determine what is quality content?

      Sorry, there is nothing wrong with providing quality content but your theory is based on wishful thinking and hot air.
      Hey Guys,

      I don't think he's that off the mark.

      It may not be a reality today, but it is quickly moving that way.

      An example is article spinning. Google's algorithms can now spot spun articles with relative ease (and I don't care how well you spin them).

      It's a game that we play with search engines and in the end the search engines will win.

      Sometime in the near future, I believe, it will be the sites with the best content that will win out.

      Best,

      Scott
      {{ DiscussionBoard.errors[2131353].message }}
      • Profile picture of the author bgmacaw
        Originally Posted by solton View Post

        An example is article spinning. Google's algorithms can now spot spun articles with relative ease (and I don't care how well you spin them).
        You better watch out
        You better not spam
        Better not cloak
        I'm telling you why
        Google Claus is coming to town
        They're making a list
        And checking it twice;
        Gonna find out who's naughty and nice
        Google Claus is coming to town
        They see you when you're sleeping
        They know when you're awake
        They know if you've been bad or good
        So be good for goodness sake!
        Oh! You better watch out!
        You better not spam
        Better not cloak
        I'm telling you why
        Google Claus is coming to town




        Google's not Santa Claus, God, Zeus, etc, etc. It's a freakin' computer algorithm. Stop anthropomorphizing it and giving it supernatural powers like you're living in a cargo cult tribe or a 3 year old visiting a mall Santa for the first time.

        :rolleyes:
        {{ DiscussionBoard.errors[2131513].message }}
    • Profile picture of the author Adam Roy
      Originally Posted by jasonmorgan View Post

      And you can back up this claim?

      Define quality content.

      How does google determine what is quality content?

      Sorry, there is nothing wrong with providing quality content but your theory is based on wishful thinking and hot air.
      I HIGHLY DISAGREE with Jason Morgan. This thread is right on the money. With more experience in IM you will realize the person who started this thread is correct.

      PETELTA IS 100% CORRECT WITH THIS THREAD.

      And I'll PROVE IT right now! Well you can prove it to yourself just read on.

      How many times have you outranked an article? And why?

      Articles have more backlinks, higher page ranks, older domain names, more pages, more content. So how do we outrank them so easy?

      Because the QUALITY of the content on our website in this PARTICULAR subject/keyword is better than what's inside the article.

      How many times have you outranked a forum in the serps?
      Forums have higher page ranks, more pages, older domain names, more backlinks, etc. So how do we outrank them so easy?

      Because the QUALITY of our content is better than theirs.

      Is this not enough proof for you? Think outside the box a bit. If it were backlinks, page rank, domain age etc that were the most important for ranking a website, you would NEVER outrank an article or forum or directory or anything else.

      You probably have very well outranked websites with nothing more than better content.

      Go to one of your websites in the SERPS.

      I can guarantee you, that hundreds of websites ranked lower than you throughout the many pages of results will have a higher page rank, more backlinks, more pages, older domain ages, etc.

      How did you outrank all of the sites below yours after your initial indexing? It's not because you had more backlinks and whatnot....think about it.
      {{ DiscussionBoard.errors[2131618].message }}
      • Profile picture of the author Tom Goodwin
        Originally Posted by friend View Post

        I HIGHLY DISAGREE with Jason Morgan. This thread is right on the money. With more experience in IM you will realize the person who started this thread is correct.
        Nice dig there. I'm sure many of us have more experience in IM than you do, and many of us strongly disagree with you and the OP.

        Originally Posted by friend View Post

        And I'll PROVE IT right now! Well you can prove it to yourself just read on.

        How many times have you outranked an article? And why?

        Articles have more backlinks, higher page ranks, older domain names, more pages, more content. So how do we outrank them so easy?
        Pages are ranked in Google, NOT domains. Typically, the articles themselves will not have more backlinks or higher page rank (which, btw, is irrelevant for ranking in Google). You are comparing apples and oranges here.

        Originally Posted by friend View Post

        Because the QUALITY of the content on our website in this PARTICULAR subject/keyword is better than what's inside the article.
        It has little to do with "quality" content. If you have good on-page SEO in terms of keyword in title, keyword density, keyword in header tags, etc., the content itself can be 100% crap and you can rank well.
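
        For illustration only, here's a minimal sketch of that kind of on-page check (keyword in the title, keyword in a header, density inside a rough band). The weights and thresholds are made up for the example; nobody outside Google knows the real ones.

        # Minimal sketch of the on-page factors above. Thresholds and weights
        # are invented for illustration, not Google's actual values.
        import re

        def on_page_score(title, headers, body, keyword):
            kw = keyword.lower()
            words = re.findall(r"[a-z0-9']+", body.lower())
            density = words.count(kw) / max(len(words), 1)
            score = 0
            if kw in title.lower():
                score += 3                               # keyword in the title tag
            if any(kw in h.lower() for h in headers):
                score += 2                               # keyword in a header tag
            if 0.005 <= density <= 0.03:
                score += 2                               # density in a "sane" band
            return score, density

        print(on_page_score("Cheap Dog Beds Reviewed",
                            ["Best cheap dog beds"],
                            "dog beds are great. " * 30 + "filler text here. " * 200,
                            "dog"))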


        Originally Posted by friend View Post

        How many times have you outranked a forum in the serps?
        Forums have higher page ranks, more pages, older domain names, more backlinks, etc. So how do we outrank them so easy?
        Again, pages rank in Google NOT domains. People on here get confused about SEO after reading comments like that.



        Originally Posted by friend View Post

        Because the QUALITY of our content is better than theirs.
        No, it would be because the on-page SEO is better more than likely. How many forum posts/pages are properly SEO'd?


        Originally Posted by friend View Post


        Is this not enough proof for you? Think outside the box a bit. If it were backlinks, page rank, domain age etc that were the most important for ranking a website, you would NEVER outrank an article or forum or directory or anything else.
        Sigh. Here we go again. There is no doubt that in order to rank well (in most circumstances), the site has to be deemed relevant by Google on that topic, based upon on-page SEO factors. But throwing the intended keyword around a few times in strategic places does not mean it's a good, quality product. I can put up pig latin and if the proper SEO is there it will rank for the keyword with backlinks.

        This is the type of thread that is bringing this place down to a level with DP.
        {{ DiscussionBoard.errors[2131670].message }}
        • Profile picture of the author Adam Roy
          Originally Posted by Tom Goodwin View Post

          Nice dig there. I'm sure many of us have more experience in IM than you do, and many of us strongly disagree with you and the OP.



          Pages are ranked in Google, NOT domains. Typically, the articles themselves will not have more backlinks or higher page rank (which, btw, is irrelevant for ranking in Google). You are comparing apples and oranges here.



          It has little to do with "quality" content. If you have good on-page SEO in terms of keyword in title, keyword density, keyword in header tags, etc., the content itself can be 100% crap and you can rank well.




          Again, pages rank in Google NOT domains. People on here get confused about SEO after reading comments like that.





          No, it would be because the on-page SEO is better more than likely. How many forum posts/pages are properly SEO'd?




          Sigh. Here we go again. There is no doubt that in order to rank well (in most circumstances), the site has to be deemed relevant by Google on that topic, based upon on-page SEO factors. But throwing the intended keyword around a few times in strategic places does not mean it's a good, quality product. I can put up pig latin and if the proper SEO is there it will rank for the keyword with backlinks.

          This is the type of thread that is bringing this place down to a level with DP.
          Everyone is making it seem as though this thread is saying that "quality content is all you have to do to get good rankings"

          I don't believe that is what is being said here. The point being made is that QUALITY content is becoming immensely more important than it used to be.

          The QUALITY AND UNIQUENESS of the content has an impact much larger than it used to.

          Obviously automated MASS link building, simple profile links, etc. are noticed by the people who're in charge of the search engines' algorithms.

          Therefore the algorithms change, giving more credit to the actual content of a site than they used to.

          And as far as the outranking an article comment, I have outranked articles which DO HAVE backlinks to the particular article, not just the directory.

          Therefore in those particular situations, the ONE THING that ranked me higher was nothing more than the content of my site.
          {{ DiscussionBoard.errors[2131717].message }}
          • Profile picture of the author Tom Goodwin
            Originally Posted by friend View Post

            Everyone is making it seem as though this thread is saying that "quality content is all you have to do to get good rankings"

            I don't believe that is what is being said here. The point being made is that QUALITY content is becoming immensely more important than it used to be.

            The QUALITY AND UNIQUENESS of the content has an impact much larger than it used to.

            Obviously automated MASS link building, simple profile links, etc. are noticed by the people who're in charge of the search engines' algorithms.

            Therefore the algorithms change, giving more credit to the actual content of a site than they used to.

            And as far as the outranking an article comment, I have outranked articles which DO HAVE backlinks to the particular article, not just the directory.

            Therefore in those particular situations, the ONE THING that ranked me higher was nothing more than the content of my site.
            As Jason Morgan has already mentioned, we would need to know more things like how many backlinks your site page has, etc.

            In any event, the OP was talking about the "quality" of content. All you are saying is that on-page SEO affects rankings, and that is nothing new and what virtually everyone preaches. But, good on-page SEO does not mean that it is quality content in any sense of the word.
            {{ DiscussionBoard.errors[2131798].message }}
            • Profile picture of the author Adam Roy
              Originally Posted by Tom Goodwin View Post

              As Jason Morgan has already mentioned, we would need to know more things like how many backlinks your site page has, etc.

              In any event, the OP was talking about the "quality" of content. All you are saying is that on-page SEO affects rankings, and that is nothing new and what virtually everyone preaches. But, good on-page SEO does not mean that it is quality content in any sense of the word.
              ON PAGE SEO is only the start of what's being called "quality content".

              Quality content also includes the following things, and YES it does have an impact on how Google sees your content.

              • SPELLING
              • GRAMMAR
              • PARAGRAPH STRUCTURE
              • LANGUAGE
              • UNIQUE CONTENT
              • EASY NAVIGATION
              • IMAGES LABELED WITH RELEVANT AND DESCRIPTIVE TEXT IN ALT/DESCRIPTION FIELDS
              • AND PROBABLY MORE
              The simple placement of keywords and LSI's throughout a site is no longer the only main factor in on-page SEO to determine "quality content".

              So by quality content, I'm not only talking about keywords, LSI's and unique content, but all of the above.
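
              Purely as an illustration of the checklist above, here's a toy way those checks could be scored. The field names and thresholds are invented; this is a guess at the idea, not how Google actually weighs anything.

              # Toy scorer for the checklist above. Field names and thresholds are
              # invented for illustration only.
              def quality_checklist(page):
                  checks = {
                      "spelling_ok":     page.get("misspellings", 0) < 3,
                      "has_paragraphs":  page.get("paragraph_count", 0) >= 3,
                      "unique_content":  page.get("duplicate_ratio", 1.0) < 0.2,
                      "images_have_alt": page.get("images_missing_alt", 0) == 0,
                      "easy_navigation": page.get("nav_links", 0) >= 5,
                  }
                  return sum(checks.values()), checks

              score, detail = quality_checklist({"misspellings": 1, "paragraph_count": 6,
                                                 "duplicate_ratio": 0.05,
                                                 "images_missing_alt": 0, "nav_links": 12})
              print(score, detail)   # 5 of 5 checks pass for this example page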
              {{ DiscussionBoard.errors[2131836].message }}
              • Profile picture of the author Jacob Martus
                Originally Posted by friend View Post

                ON PAGE SEO is only the start of what's being called "quality content".

                Quality content also includes the following things, and YES it does have an impact on how Google sees your content.

                • SPELLING
                • GRAMMAR
                • PARAGRAPH STRUCTURE
                • LANGUAGE
                • UNIQUE CONTENT
                • EASY NAVIGATION
                • IMAGES LABELED WITH RELEVANT AND DESCRIPTIVE TEXT IN ALT/DESCRIPTION FIELDS
                • AND PROBABLY MORE
                The simple placement of keywords and LSI's throughout a site is no longer the only main factor in on-page SEO to determine "quality content".

                So by quality content, I'm not only talking about keywords, LSI's and unique content, but all of the above.
                I can't write a complete **** piece of content and satisfy every one of those things.
                {{ DiscussionBoard.errors[2133051].message }}
  • Profile picture of the author bgmacaw
    Originally Posted by petelta View Post

    You can never fake the quality of the content.
    So true, the Google algorithm is becoming quite smart these days. So smart, it's scary. Here's a capture from a video camera at the Google HQ this morning...


    {{ DiscussionBoard.errors[2128227].message }}
  • Profile picture of the author jasonmorgan
    You can never fake the quality of the content.
    So you're telling me that if I hire Stephen Hawking and a high school student to each write a 1,000 word essay on physics and put each essay on a separate site, the google bots will know which essay is of higher value?

    No matter how smart the google bots are, they aren't able to judge content quality that well.

    Content is important but you only need to meet a few criteria (which are debatable) to pass the quality content test as far as google is concerned.

    After you've passed that test it's all about on and off page SEO, which is probably around 90% of how and why a site ranks where it does.

    Quality content is a great thing and it's what can make or break a site but I don't think it has very much importance when it comes to actual ranking.

    I've never seen an example of a site ranking well based on content alone.
    Signature

    I'm all about that bass.

    {{ DiscussionBoard.errors[2128669].message }}
    • Profile picture of the author Dean Martin
      Originally Posted by jasonmorgan View Post

      So you're telling me that if I hire Stephen Hawking and a high school student to each write a 1,000 word essay on physics and put each essay on a separate site, the google bots will know which essay is of higher value?

      No matter how smart the google bots are, they aren't able to judge content quality that well.
      Not to mention that Google's algorithm may well score the high school student higher, thinking that it's more understandable to the average 'googler'.

      I see a lot of crap ranked fairly high while some pretty good information languishes between pages 10 and 100.
      {{ DiscussionBoard.errors[2128828].message }}
    • Profile picture of the author Davioli
      Originally Posted by jasonmorgan View Post

      So you're telling me that if I hire Stephen Hawking and a high school student to each write a 1,000 word essay on physics and put each essay on a separate site, the google bots will know which essay is of higher value?

      No matter how smart the google bots are, they aren't able to judge content quality that well.

      Content is important but you only need to meet a few criteria (which are debatable) to pass the quality content test as far as google is concerned.

      After you've passed that test it's all about on and off page SEO, which is probably around 90% of how and why a site ranks where it does.

      Quality content is a great thing and it's what can make or break a site but I don't think it has very much importance when it comes to actual ranking.

      I've never seen an example of a site ranking well based on content alone.
      I'm just venturing a guess on this quality-judging thing. (Personally, I don't believe Google can read quality content... but you gotta have quality content for good conversion etc.)

      Here's my guess though:
      There is a chance Google uses bounce rates to judge the quality/relevance of the content. While this can be collected through Analytics... there is also a chance that Google has some other way of finding out (if Analytics is not installed).
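
      For anyone wondering what that metric actually is, here's a rough sketch of bounce rate (the share of visits that viewed only one page). The log format is invented for the example; how Google would collect this, if at all, is pure speculation.

      # Rough sketch of bounce rate: the share of visits that viewed only one page.
      # The log format here is invented for illustration.
      from collections import defaultdict

      pageviews = [
          ("visitor-1", "/page-a"),
          ("visitor-1", "/page-b"),
          ("visitor-2", "/page-a"),   # single-page visit -> counts as a bounce
          ("visitor-3", "/page-a"),   # another bounce
      ]

      pages_per_visit = defaultdict(int)
      for visitor, page in pageviews:
          pages_per_visit[visitor] += 1

      bounces = sum(1 for n in pages_per_visit.values() if n == 1)
      print("bounce rate:", bounces / len(pages_per_visit))   # 2 of 3 visits bounced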
      Signature

      {{ DiscussionBoard.errors[2128935].message }}
      • Profile picture of the author Crew Chief
        Originally Posted by petelta View Post

        As time progresses though, so does the algorithm. Things like keyword usage and backlinks can be manipulated and taken advantage of. You can never fake the quality of the content.

        As Google's algorithm for recognizing quality progresses, so should the quality of what you produce. One of the best ways to get ranked highly in search engines is to provide something better than the already ranked opponents.

        Travis
        Travis, using your assessment, here are two of your paragraphs repackaged...

        As we move forward in time and the Internet evolves, rest assured, so does the technology that Google utilizes. Elements such as keyword density and back linking have been proven to be exploitable. However, it is not possible to fake the level of the quality of one's articles.

        The fact that Google's criteria for detecting quality continues to advance and develop, so must the caliber of the articles and content that you submit. One of the better strategies of obtaining higher rankings in the SERPs would be to offer much better content than your competition who are ranked higher than you are.

        Two questions...

        (1). Does that mean, your competitors can repackage all of your content and use it to then outrank you?

        (2). Which content is better? The original or the repackaged? Let's take a poll people.

        Originally Posted by jasonmorgan View Post

        So I guess the important question would be, what is quality content?

        Quality content is providing the best answer to the questions being searched
        is way too vague. It's a non-answer that a politician would give.

        What content criteria are looked at by google to give a site a higher ranking?

        Macgruber... great film or total crap? You'll get widely different opinions on the content of that film.

        How is a google bot supposed to determine what is quality and what is crap? It's not skynet, it's working off a check list of factors and variables.
        Jason, I found the answer! Google has algorhythmic technology that it now runs all content through including, RSS feeds, forum posts and videos. Yep, this technology can even grade videos! I found the source code to the said technology hidden in this video. Listen real carefully and you can hear the Google algorhythmic processors evaluating the video in the background.

        Signature
        Tools, Strategies and Tactics Used By Savvy Internet Marketers and SEO Pros:

        ProSiteFlippers.com We Build Monetization Ready High-Value Virtual Properties
        {{ DiscussionBoard.errors[2128985].message }}
      • Profile picture of the author Jacob Martus
        Originally Posted by Davioli View Post

        I'm just venturing a guess into this quality judging thing. (personally, I don't believe Google can read quality content.. but you gotta have quality content for good conversion etc)

        Here's my guess though:
        There is a chance Google uses Bounce rates to judge the quality/relevance of the content. While this can be collected through Analytics.. there is also a chance that Google might have some other way of finding out(if analytics is not installed).
        Bounce rate could be easily manipulated with bots to artificially inflate average time on site and lower the bounce rate. I think it is a small factor...but nothing of major influence on your rankings.
        {{ DiscussionBoard.errors[2129085].message }}
        • Profile picture of the author Kurt
          Here's how I would do it...If you're going to be a critic, please post your own solution.

          1 On page relevancy

          2 Links - I'd make profile, comment and similar easy-to-automate links worthless. If programs can find footprints, so can Google.

          3 I'd take the top 80 (or so) sites from these results and mix in 20 new/other sites for testing, and then apply the various "people rank" factors I posted above.

          4 Then I'd have human reviewers check the top 10 results for the most profitable 100,000 keywords to check for any crap.

          It isn't an either/or decision to use one type of criterion or another; a blend is likely the best alternative.
          Signature
          Discover the fastest and easiest ways to create your own valuable products.
          Tons of FREE Public Domain content you can use to make your own content, PLR, digital and POD products.
          {{ DiscussionBoard.errors[2129252].message }}
  • Profile picture of the author remodeler
    Originally Posted by petelta View Post

    It's not keyword placement in any form. It's not the amount or even the quality of backlinks to your page. It's....the quality of the content. Travis
    Originally Posted by petelta View Post

    You have to backlink, you have to have SEO...
    I'm confused...
    {{ DiscussionBoard.errors[2128778].message }}
    • Profile picture of the author petelta
      Originally Posted by remodeler View Post

      I'm confused...
      The point of the whole post is that quality is a growing factor in how well you are getting ranked. At this point in time you are going to need all aspects, BECAUSE that's what the algorithm calls for. But the algorithm changes.

      I bet one day there will be technology that can tell the difference in quality between Stephen Hawking and a high school student.

      Look at voice recognition and AI software these days. You can have conversations with robots. It's all just formulas.
      Signature
      TEESPRING Student Rakes In Over $116k In Less Than 3 Months
      Niche Pro Profits - How I raked in OVER $120k in 9 months with authority niche sites...

      {{ DiscussionBoard.errors[2128813].message }}
  • Profile picture of the author jasonmorgan
    So I guess the important question would be, what is quality content?

    Quality content is providing the best answer to the questions being searched
    is way too vague. It's a non-answer that a politician would give.

    What content criteria are looked at by google to give a site a higher ranking?

    Macgruber... great film or total crap? You'll get widely different opinions on the content of that film.

    How is a google bot supposed to determine what is quality and what is crap? It's not skynet, it's working off a check list of factors and variables.
    Signature

    I'm all about that bass.

    {{ DiscussionBoard.errors[2128847].message }}
    • Profile picture of the author petelta
      Originally Posted by jasonmorgan View Post

      How is a google bot supposed to determine what is quality and what is crap? It's not skynet, it's working off a check list of factors and variables.
      I totally agree. You can only guess just like you can only guess what combination of backlinks works best, etc.

      I have no idea what they want. Maybe relevant keyword usage, sentence composition, grade level to understand the material. I don't know. But Google is going to advance past what gives results today and they want to deliver quality.

      The whole point of their algorithm is to deliver quality to the searcher. They are going to continue to find variables and factors that determine "better" quality. Like mentioned, at one time better quality meant you had more votes from other sites...backlinks.

      They aren't going to take steps backwards though to ways that can be fooled and manipulated.

      I'm seeing much quicker search engine results these days with the more informative and relevant content I'm posting.
      Signature
      TEESPRING Student Rakes In Over $116k In Less Than 3 Months
      Niche Pro Profits - How I raked in OVER $120k in 9 months with authority niche sites...

      {{ DiscussionBoard.errors[2128875].message }}
    • Profile picture of the author Kurt
      Originally Posted by jasonmorgan View Post

      So I guess the important question would be, what is quality content?

      Quality content is providing the best answer to the questions being searched is way too vague. It's a non-answer that a politician would give.

      What content criteria are looked at by google to give a site a higher ranking?

      Macgruber... great film or total crap? You'll get widely different opinions on the content of that film.

      How is a google bot supposed to determine what is quality and what is crap? It's not skynet, it's working off a check list of factors and variables.
      Here's some possible ways:

      Hire human quality inspectors. I've posted the math/finances for this many times, and it's very feasible for Google to have real humans monitor their top 100,000 or so most valuable search queries. It's actually dumb if they don't.

      Track click-through rate from the SERPs. Yahoo did a study a few years back (sorry, I've lost the link) that showed that user click-through rates were a lot more accurate for measuring a site than their algo.

      Track click depth - If a person clicks to a site, how many times do they click in that site? Do they click through the site and never return to the SERPs or do they click to a site and after a few seconds click back to the SERPs and then click another site? Returning from a page after a couple of seconds then clicking to another page and staying probably isn't a good thing for the first page.

      Scroll depth - How far did a person scroll down a page?

      Scroll speed - Was the user scrolling down the page at a "reading" speed or a "scanning" speed?

      Was a page reached from the browser's bookmarks or favorites? Typed into the browser address bar? A page reached using browser bookmarks may be an indication of quality.

      All of these tracking methods are easily measured via the Google toolbar and are all possible predictors of a page's "quality" as it relates to the search query.

      This stuff isn't new, I wrote about this a few years ago...

      And IMO with links becoming less an indicator of quality and more an indicator of who bought what software, the engines will have to make more of a shift to what I call "People Rank", which is to use more human behavior to determine which sites should rank higher.

      Of course, SEOers will find a way to manipulate these results, and the next round of "spy vs. spy" will continue.
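
      To make the idea concrete, here's a back-of-the-envelope "People Rank" that blends those signals into one number. The weights and field names are invented; whether Google actually uses toolbar data this way is guesswork.

      # Back-of-the-envelope "People Rank" built from the signals above.
      # Weights and field names are invented; this is speculation, not Google's algo.
      def people_rank(signal):
          return (0.35 * signal["serp_ctr"]                       # click-through from the SERP
                + 0.25 * min(signal["click_depth"] / 5.0, 1.0)    # clicks within the site
                + 0.20 * signal["scroll_depth"]                   # fraction of the page scrolled
                + 0.10 * (1.0 if signal["scroll_speed"] == "reading" else 0.0)
                + 0.10 * (1.0 if signal["bookmarked"] else 0.0))  # reached via bookmarks

      print(people_rank({"serp_ctr": 0.4, "click_depth": 3, "scroll_depth": 0.8,
                         "scroll_speed": "reading", "bookmarked": False}))   # ~0.55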

      Signature
      Discover the fastest and easiest ways to create your own valuable products.
      Tons of FREE Public Domain content you can use to make your own content, PLR, digital and POD products.
      {{ DiscussionBoard.errors[2128938].message }}
      • Profile picture of the author bgmacaw
        Originally Posted by Kurt View Post

        Hire human quality inspectors.
        They do this but, based on what I've heard and observed, they're underskilled and overworked. They're given to snap judgments based on how a single page on a website looks, most often the index page, during a 30 second or less visit.

        Originally Posted by Kurt View Post

        Track click-through rate from the SERPs.
        Track click depth
        Scroll depth
        Scroll speed
        Was a page reached from the browser's bookmarks or favorites?
        There are four fundamental problems with these metrics.

        First of all, they can't accurately use many of these metrics on a site that isn't entirely textual or take into account various site designs. For example, a site that is primarily Flash or video content. Also, as you've noted on occasion, some sites act as authority hub sites and will have different usage patterns than other sites.

        Second, user patterns of browsers can be hard to categorize. For example, some people open search results pages into other browser tabs. This would cause pages on background tabs to appear to be open longer. Different browsers perform differently and may cause different user reactions. Also, not everyone has the Google toolbar installed or logs into a Google account.

        Next, this leads into privacy issues. If Google is measuring and storing this kind of personal detail, it opens huge privacy concerns. Since Google is already under attack on this issue, accelerating it could cause them serious legal and public goodwill issues.

        Lastly, as I assume you noted with your Spy vs. Spy video, if these metrics were to ever become important site quality measurements, they could be gamed rather easily.
        {{ DiscussionBoard.errors[2129073].message }}
        • Profile picture of the author Kurt
          Originally Posted by bgmacaw View Post

          They do this but, based on what I've heard and observed, they're underskilled and overworked. They're given to snap judgments based on how a single page on a website looks, most often the index page, during a 30 second or less visit.

          There are four fundamental problems with these metrics.

          First of all, they can't accurately measure the quality of a site that isn't entirely textual or take into account various site designs. For example, a site that is primarily Flash or video content. Also, as you've noted on occasion, some sites act as authority hub sites and will have different usage patterns than other sites.

          Second, user patterns of browsers can be hard to categorize. For example, some people open search results pages into other browser tabs. This would cause pages on background tabs to appear to be open longer. Different browsers perform differently and may cause different user reactions. Also, not everyone has the Google toolbar installed or logs into a Google account.

          Next, this leads into privacy issues. If Google is measuring and storing this kind of personal detail, it opens huge privacy concerns. Since Google is already under attack on this issue, accelerating it could cause them serious legal and public goodwill issues.

          Lastly, as I assume you noted with your Spy vs. Spy video, if these metrics were to ever become important site quality measurements, they could be gamed rather easily.
          First...About human editors making snap decisions...Yep, they are probably paid some type of "piece work" and in a hurry. However, saying what they do is one thing, offering a strategy is another, which is why money sites should be able to pass the "30 second quick look" check.

          Of course there are weaknesses to any system where there's financial gain, including using the data I suggested.

          There are weaknesses to on-page algos too, as they can be easily manipulated and easily reverse engineered if that were all they used. That's pretty obvious and pretty easy; we've been doing it for years.

          Link popularity aka "Page Rank" is also flawed. Linking really skews results towards older pages, making it virtually impossible for a new "better" page to outrank an older "lower quality" page. Just like in high school, popularity is not a true indicator of quality.

          And like I said above, nowadays linking is just about what software you bought, and not about quality. Just because one guy has rumrunner and the other guy doesn't hardly means the first guy's page is better. I'm sure Google is starting to realize this.

          Using algos and link ranking formulas is "artificial" and flawed to begin with. Humans will always be able to detect quality far better than fancy math algos.

          But maybe the biggest problem with on-page and linking is how they can be "semi" reverse engineered, since they are math equations. Adding human elements to the equation also adds a good bit of randomness which cannot be reverse engineered.

          As far as privacy, what did the TOS say when you downloaded and installed the Google bar? And if they don't tie personal info to the data they collect, and they don't need to, then the privacy issue becomes almost n/a.

          The truth of the matter is, Google isn't a search engine company, it's a data-mining company and has been for years.

          For your concerns about faking it, can you be more specific and give real-life examples? Remember, if Google uses the Googlebar for tracking, then making proxies, clearing cookies, etc. is all worthless, as Google can (is?) easily add an individual tracking ID for each toolbar.

          Do you set up virtual machines and download the toolbar individually to each, then....? Just asking, as it never hurts to have a contingency plan.
          Signature
          Discover the fastest and easiest ways to create your own valuable products.
          Tons of FREE Public Domain content you can use to make your own content, PLR, digital and POD products.
          {{ DiscussionBoard.errors[2129168].message }}
          • Profile picture of the author bgmacaw
            Originally Posted by Kurt View Post

            As far as privacy, what did the TOS say when you downloaded and installed the Google bar? And if they don't tie personal info to the data they collect, and they don't need to, then the privacy issue becomes almost n/a.
            Privacy is becoming a big issue though, not only for Google but for Facebook and other big names that want as much data as they can collect on users.

            This is what they want when it comes to ad delivery...


            ...and a lot of people don't want a world like that. I think we will see a backlash against this level of data collection soon, if not in the US, in Europe.

            Originally Posted by Kurt View Post

            Their main job could be just to delete obvious crap that makes it past all the other factors and into the top 10.
            I think thefind.com and nextag.com are obvious crap but since they get a lot of their funding from current and former Google employees, well, they get a pass where 'average Joe' thin sites get deindexed.

            I think your human review idea is good, but a bit idealistic since it assumes objective ratings and a nearly unlimited budget. What happens in the real world is they try to do this as cheaply and quickly as possible. I don't think a reviewer sitting in a cube in Bangalore who doesn't have the cultural background or proper subject matter knowledge can accurately rank sites. It would be like you or me trying to score Chinese sites for Baidu. Sure, they'll get some very obvious sites but they're also likely to have many more false positives and clear misses.
            {{ DiscussionBoard.errors[2129464].message }}
            • Profile picture of the author Kurt
              Privacy is becoming a big issue though, not only for Google but for Facebook and other big names that want as much data as them can collect on users.

              It's possible, but again if the data isn't tied to a person, I don't think it's an issue.


              I think thefind.com and nextag.com are obvious crap but since they get a lot of their funding from current and former Google employees, well, they get a pass where 'average Joe' thin sites get deindexed.

              If true, we can logically assume there is human input.


              I think your human review idea is good, but a bit idealistic since it assumes objective ratings

              And PageRank, the concept that everything on the web is "democratic" and a link to a site is a vote or "citation" as to the quality of that site, isn't as idealistic?

              and a nearly unlimited budget. What happens in the real world is they try to do this as cheaply and quickly as possible. I don't think a reviewer sitting in a cube in Bangalore who doesn't have the cultural background or proper subject matter knowledge can accurately rank sites.

              Actually this is a better example of the political world which is all about short-term budgets, not the real business world. In the business world, the best practice is to look for the most cost-effective solution not the cheapest, and if someone in India isn't getting the job done for whatever reason, you need another plan.

              ..and a nearly unlimited budget.

              Google does have a nearly unlimited budget...

              Question: Let's assume the SERPs are the most valuable asset Google has and needs to protect them in order to return good, fast, relevant results and avoid outside influences.

              Let's also accept the following as fact:
              According to the press release, Google's revenues for Q1 2010 were $6.77 billion

              Google: Quarterly earnings call reveals increased apps, N1 profitability
              So let's say Google makes $2.25 billion per month.

              How much would you say Google should budget to the project of checking web pages?

              How about just 1% of gross revenue?

              That would be $22.5 million per month.

              At $10 per hour, that's 2.25 million "man hours".

              1 minute per page (way too much time) x 60 minutes per hour = 60 pages per man hour.

              2,250,000 Man hours
              x 60 pages checked per hour for each "man"
              -------------
              135,000,000

              So, for 1% of their total revenue, Google could have humans check 135 million pages per month. Or, 10 people could each check 13.5 million pages per month.

              Plus, you can probably check a page twice as fast as I posted...And you don't need to check that many pages...

              I bet the total cost to check the most valuable SERPs is less than .1% of total earnings.

              I guess the real question now is, why don't they?
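
              Here's the same back-of-the-envelope math laid out so anyone can poke at the assumptions. The revenue figure comes from the quoted Q1 2010 press release; the 1% share, the $10/hour wage and the minute per page are just the assumptions stated above.

              # The same back-of-the-envelope math as above. Revenue figure is from the
              # quoted Q1 2010 press release; the 1% share, $10/hour wage and 1 minute
              # per page are the assumptions stated in this post.
              quarterly_revenue = 6.77e9                     # Q1 2010, USD
              monthly_revenue = quarterly_revenue / 3        # ~ $2.25 billion/month
              review_budget = 0.01 * monthly_revenue         # 1% -> ~ $22.5 million/month
              man_hours = review_budget / 10                 # at $10 per hour
              pages_per_month = man_hours * 60               # 1 minute per page
              print(f"{pages_per_month:,.0f} pages checked per month")   # ~135,400,000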
              Signature
              Discover the fastest and easiest ways to create your own valuable products.
              Tons of FREE Public Domain content you can use to make your own content, PLR, digital and POD products.
              {{ DiscussionBoard.errors[2129578].message }}
              • Profile picture of the author bgmacaw
                Originally Posted by Kurt View Post

                Question: Let's assume the SERPs are the most valuable asset Google has and needs to protect them in order to return good, fast, relevant results and avoid outside influences.
                Stockholders will view advertising sales as the most valuable asset with search results being the major delivery system for the ads. If the search results aren't returning ad sales, then they're useless. In fact, there may be some incentive to make ads more relevant and quality controlled than search results. Hmmm..... I hadn't considered that before, especially in the light of recent Adwords bannings of affiliate marketers.

                Originally Posted by Kurt View Post

                So, for 1% of their total revenue, Google could have humans check 135 million pages per month. Or, 10 people could each check 13.5 million pages per month.
                I don't think they would be able to get these kinds of results in the real world. People aren't machines, and asking them to make this many subjective decisions over the course of a fast-paced 12-hour work day will result in an unacceptable level of errors.

                I do understand they'll use visual inspectors to some degree to help them track down stuff the algorithm can't easily handle or make a judgment call on questionable sites but I don't see them deploying it across the board. The inspectors are given an algorithmically generated work list that might contain sites that, for example, are on brand new domains and use a well known MFA or spamming template or are domains that are suddenly ranking for 'cheap viagra' or 'bad credit auto loans'. I can't see it being an affordable project without really narrowing down the inspection list algorithmically like they do now.
                {{ DiscussionBoard.errors[2131569].message }}
                • Profile picture of the author Kurt
                  Originally Posted by bgmacaw View Post

                  Stockholders will view advertising sales as the most valuable asset with search results being the major delivery system for the ads. If the search results aren't returning ad sales, then they're useless. In fact, there may be some incentive to make ads more relevant and quality controlled than search results. Hmmm..... I hadn't considered that before, especially in the light of recent Adwords bannings of affiliate marketers.



                  I don't think they would be able to get these kinds of results in the real world. People aren't machines, and asking them to make this many subjective decisions over the course of a fast-paced 12-hour work day will result in an unacceptable level of errors.

                  I do understand they'll use visual inspectors to some degree to help them track down stuff the algorithm can't easily handle or make a judgment call on questionable sites but I don't see them deploying it across the board. The inspectors are given an algorithmically generated work list that might contain sites that, for example, are on brand new domains and use a well known MFA or spamming template or are domains that are suddenly ranking for 'cheap viagra' or 'bad credit auto loans'. I can't see it being an affordable project without really narrowing down the inspection list algorithmically like they do now.
                  The point of the math was to prove Google could indeed afford human inspectors, contrary to your claim that they needed an "unlimited budget".

                  My example was only one of infinite possibilities...If the workers can't handle 12-hour days, hire 3 times as many and have them work 4-hour days.

                  Or maybe they don't need to review 13 million pages a month...Or maybe they can hire people for less than $10. I just gave ONE EXAMPLE to prove that Google can afford to check pages. Sure, there are flaws in my post; it's a casual conversation, not an in-depth business plan.

                  As I said, Google wouldn't need to do this for every keyword or for all pages....Only their top money making searches and only after the pages were run through an algo.

                  Like I said in an earlier post, the smart money would be bet that Google is figuring out how to make it work, instead of trying to come up with reasons why it won't.
                  Signature
                  Discover the fastest and easiest ways to create your own valuable products.
                  Tons of FREE Public Domain content you can use to make your own content, PLR, digital and POD products.
                  {{ DiscussionBoard.errors[2132480].message }}
                  • Profile picture of the author bgmacaw
                    Originally Posted by Kurt View Post

                    My example was only one of infinite possibilities...If the workers can't handle 12-hour days, hire 3 times as many and have them work 4-hour days.
                    You sound like a former IT VP I worked with a while ago. He had a manufacturing background and almost no IT experience. He wondered if a programming project I was managing would go faster if we had two or three shifts of programmers working on it.

                    I do agree with your idea of them employing human inspectors and improving their inspection process but I think there are some budgetary and manpower limits to it that make it less practical on a large scale than it sounds on paper. But, hey, in most of my programming career I've done reporting for accounting managers and VPs, the infamous "Dr No's" in most organizations. That probably colors my thinking a lot when it comes to ambitious projects.

                    Overall, I think that the kind of technology and human rankings you're talking about are too complex for them to pull off on a large scale for at least 5 years, if not longer. Add in the fact that they're essentially well on their way to becoming a "big public corporation" with layers of bureaucracy to deal with and advertising revenues to protect and expand at all costs, not to mention significant legal problems with privacy and anti-trust, it makes it even less likely to be implemented beyond a testing scale anytime soon.
                    {{ DiscussionBoard.errors[2133168].message }}
    • Profile picture of the author tjcocker
      Originally Posted by jasonmorgan View Post

      Macgruber... great film or total crap?
      The answer to that is great film, obviously.

      Or am I the only one that's been walking around for the past 3 weeks shouting "MacGruber!" at random moments? C'mon?
      Signature
      Initrode Consulting -Boulder SEO, Copywriting, Editing, Website design, etc...
      {{ DiscussionBoard.errors[2128961].message }}
  • Profile picture of the author pandorasbox
    Ok, I do not know what planet you're on, but short of an alien, a telepath or full AI (which search engines do not have), there is no way for them to recognize quality. Dear god, most of Google's search results are outdated crap, and some of them are a bunch of worthless sites. The main aspect in ranking is the amount of content, keyword usage and on-site optimization. The second part is how many links you have. Quality has nothing to do with it. Quality has to do with getting new clients or retaining them.
    {{ DiscussionBoard.errors[2129025].message }}
  • Profile picture of the author Jacob Martus
    The other problem with using human reviewers to judge quality and relevance is that many times you are going to get LOTS of quality pages. Allowing a person to make the judgement call on which one is most relevant would skew results worse than anything else.

    You can't eliminate bias from anyone. A reviewer might like a website more than another and for that reason rank it higher.
    {{ DiscussionBoard.errors[2129266].message }}
    • Profile picture of the author Kurt
      Originally Posted by Jacob Martus View Post

      The other problem with using human reviewers to judge quality and relevance is that many times you are going to get LOTS of quality pages. Allowing a person to make the judgement call on which one is most relevant would skew results worse than anything else.

      You can't eliminate bias from anyone. A reviewer might like a website more than another and for that reason rank it higher.
      Reviewers per se don't need to rate pages. Their main job could be just to delete obvious crap that makes it past all the other factors and into the top 10.

      Plus, each SERP can be checked by multiple reviewers, with scoring similar to Olympic diving...Toss out the high score and the low score, then average the rest. Mix this in with the other algo factors so the humans don't have total control and no human score is the result of one person.
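
      As a quick sketch of that scoring idea (the 70/30 blend weight is an invented example, purely illustrative):

      # Sketch of the "Olympic diving" idea above: drop the high and low reviewer
      # scores, average the rest, then blend with the algorithmic score.
      # The 70/30 blend weight is an invented example.
      def blended_score(algo_score, reviewer_scores):
          trimmed = sorted(reviewer_scores)[1:-1]      # toss out the high and the low
          human_score = sum(trimmed) / len(trimmed)
          return 0.7 * algo_score + 0.3 * human_score  # humans never get total control

      print(blended_score(8.2, [6.0, 7.5, 8.0, 9.0, 9.5]))   # ~8.19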
      Signature
      Discover the fastest and easiest ways to create your own valuable products.
      Tons of FREE Public Domain content you can use to make your own content, PLR, digital and POD products.
      {{ DiscussionBoard.errors[2129291].message }}
      • Profile picture of the author Jacob Martus
No matter what, when you mix a human element into the equation you are introducing something that could potentially skew results and hurt legitimate sites... at least in my opinion.

I think if it were feasible and the best idea, Google would already be doing it that way. They've got all the money in the damn world to figure out how to return the most relevant results.

        I totally get what you're saying but whenever there are humans involved there are errors involved too.
        {{ DiscussionBoard.errors[2129352].message }}
        • Profile picture of the author Kurt
          Originally Posted by Jacob Martus View Post

          No matter what when you're mixing a human element to the equation you are introducing something that could potentially skew results and hurt legitimate sites...at least in my opinion.

          I think if it was feasible + the best idea that Google would already be doing it that way. They've got all the money in the damn world to know how and return the most relevant results.

          I totally get what you're saying but whenever there are humans involved there are errors involved too.
          There is no perfect system...Someone brought up how Google could judge quality and I gave some suggestions.

          While using human factors has its weaknesses, so does the present system. This system is broken and the handwriting is on the wall...

Take a look around and you'll see that Google and Yahoo have both made it just about impossible to sign up for free email addresses now. And who uses free email addresses en masse? SEOers getting links and creating profiles.

Look around and you can find sites selling a MILLION Yahoo accounts. What does anyone use 1,000,000 Yahoo email accounts for? To spam. And I don't think it's a coincidence that Yahoo and Gmail are both run by search engines.

          Bloggers can't have comments turned on any more because they are being spammed to death.

          Bookmarking services now almost all use nofollow.

          Popular forum/community scripts like Kickapps are making all links nofollow.

          Scraper programs that can mass-submit are cheaper and more popular than ever.

Programs like ubot allow people like me (a non-programmer) to build automation "bots", making mass link submission available to even more people.

          And this very Warrior Forum has banned these spammy type WSOs.

          People are getting very angry and starting to take action. We can bring up all the problems with using people as part of the ranking process, but this linking thing is getting out of hand and making a lot of people very, very angry.

          The way the "system" is now, it rewards those that are willing to vandalize the sites, blogs and pages of others. Something has to change.

          And I promise that Google isn't dismissing "people rank-like" strategies because of the flaws. Instead, I'll bet anything that they are working on finding solutions for the weaknesses.

          (I also believe Google has been using "People Rank" of some kind for a while, on their big searches...It will just become a bigger factor in the rankings as time goes on)
          Signature
          Discover the fastest and easiest ways to create your own valuable products.
          Tons of FREE Public Domain content you can use to make your own content, PLR, digital and POD products.
          {{ DiscussionBoard.errors[2129425].message }}
  • Profile picture of the author culvers
What's so wrong with the current system? I can pretty much find everything I'm looking for somewhere on the first page. Does it really matter if the "best site" is positioned at number 8 instead of number 1? As long as the user can find it quickly, I don't see the problem.

*edit* to add that there is no "best site"; people will have different opinions. People will also be looking for different things when typing in a search term, e.g. someone searching for "blue widgets" may be looking to buy, whereas another person searching for "blue widgets" may just want reviews or information.
    Signature

    {{ DiscussionBoard.errors[2129483].message }}
  • Profile picture of the author Tom Goodwin
    Umm.....




    No.



    You can think whatever you want about how YOU would rank websites, but that is not how Google ranks websites. You can rank crap content quite easily with decent backlinks.



    Originally Posted by petelta View Post

    Okay, it's not new at all. It's more just getting more and more recognition as time goes by. It's not keyword placement in any form. It's not the amount or even the quality of backlinks to your page. It's....the quality of the content.

    You have to look at what Google actually is. They are a service that provides their users with answers to their questions. Google wants more then anything to provide their users with the best quality content to their answers.

    Now, the internet is so large, they can't do this with actual people. They use an algorithm and bots to do the major legwork to find the best quality results. In the past, the factors have been keyword usage, backlinks, etc. That's because the algorithm made it so.

    As time progresses though, so does the algorithm. Things like keyword usage and backlinks can be manipulated and taken advantage of. You can never fake the quality of the content.

    As google's algorithm for recognizing quality progresses, so should the quality of what you produce. One of the best aspects of getting ranked highly in search engines is to provide something better then the already ranked opponents.

    Travis
    {{ DiscussionBoard.errors[2130261].message }}
  • Profile picture of the author Gary Becks
    My only question is where people come up with this type of stuff??:confused:
    {{ DiscussionBoard.errors[2130499].message }}
  • Profile picture of the author petelta
Lol, the replies from some of you are just hilarious. I love how people get so offensive in the SEO forum. Don't think of this as so much factual as theoretical.

I was looking at it as what Google tries to do for their users and what direction we can guess they are going to go in with their algorithm. TODAY, you can get plenty of success with backlinks and on-page SEO, but it changes quite often. The ideal algorithm will only give their users quality content, though.

Stop trying to define quality... we all know what a quality piece of content is... it's not crap. It answers what the reader is searching for. That's it. No reason to overcomplicate it.

    Travis
    Signature
    TEESPRING Student Rakes In Over $116k In Less Than 3 Months
    Niche Pro Profits - How I raked in OVER $120k in 9 months with authority niche sites...

    {{ DiscussionBoard.errors[2131274].message }}
  • Profile picture of the author jasonmorgan
"Don't think of this as so much factual as theoretical."

"The NEW number 1 aspect to getting top ranked in search engines"

You threw the first pitch.

"I love how people get so offensive in the SEO forum"

Not nearly as uptight as some of the people in the main forum.

"Quality content and Google rankings: does it make a difference?" would have been a better title.

    If you're gonna make a bold statement you need to be prepared to back it up.

    This place is already packed with weekend seo warriors who don't accomplish much aside from parroting something they found on some random blog written by another weekend seo warrior. Not to mention the knuckleheads who bought a couple of Angela's backlink packs and are pitching themselves as SEO gurus.
    Signature

    I'm all about that bass.

    {{ DiscussionBoard.errors[2131588].message }}
  • Profile picture of the author jacksonlin
    What is quality anyway?

    A. What people want to read in order to make themselves feel better?

    B. Or do they want to read the same old rehashed cliches to make themselves happier?

    C. Or do they want to hear the real truth? But it's going to hurt their feelings.

    Which one is quality?

I can guarantee you 95% of people will choose option A or B and call that quality, but in reality it's C they need to take a good look at, yet they will call it rubbish.

For example, I'm in the dating niche. You won't believe the number of people who like to give me better dating tips, yet they are the ones typing "how to attract men" or "how to attract women" into a search engine, disgruntled and unhappy after reading an answer along the lines of C.

So quality, to them, equates to whatever makes them feel better.

    So what is quality may I ask?
    Signature
    Want a 13 Part FREE Internet Marketing Course - Taught By A PREMIER CLICKBANK SUPPER AFFILIATE? Did I mention taught through VIDEOS?
    Yup, I'm not hyping things up for you. Click here to check it out!
    {{ DiscussionBoard.errors[2131628].message }}
  • Profile picture of the author jasonmorgan
"Is this not enough proof for you? Think outside the box a bit. If it were backlinks, page rank, domain age etc that were the most important for ranking a website, you would NEVER outrank an article or forum or directory or anything else."
    Well...

    I wasn't aware that articles, directories and forums were competition. I've never had a problem stepping past them.
    Signature

    I'm all about that bass.

    {{ DiscussionBoard.errors[2131713].message }}
  • Profile picture of the author Negotiator74
    "Quality" content is subjective and can not be filtered, judged, or ranked by a search engine. As smart as Google is....they aren't this smart....you are giving them way too much credit.
    {{ DiscussionBoard.errors[2131742].message }}
  • Profile picture of the author jasonmorgan
"And as far as the outranking an article comment, I have outranked articles which DO HAVE backlinks to the particular article, not just the directory.

Therefore in those particular situations, the ONE THING that ranked me higher was nothing more than the content of my site."

but...

How many backlinks did your own page/site have?
How many backlinks did the article page have?
Is that article chasing the same keyword as you?
How well or poorly done are the other on-page SEO factors for both sites?

You can't just say it was the content when your outranking it could have come from a combination of things.
    Signature

    I'm all about that bass.

    {{ DiscussionBoard.errors[2131744].message }}
  • Profile picture of the author jasonmorgan
"ON PAGE SEO is only the start to what's being called 'quality content'"

You're shifting the topic to fit your vision of what "quality content" and SEO are.

I do (sorta) agree with many of your points and the OP's

"QUALITY content is becoming immensely more important than it used to be."
"The QUALITY AND UNIQUENESS of the content has an impact much larger than it used to."
I think we can all agree on what Google's, or any search engine's, intent is: to provide the most relevant and highest-quality search results they can.

    Yes, they are probably moving in the direction of the items I quoted. Are they there? I don't think they are as close as others might think. I'm sure that they want to move more in that direction.

    • SPELLING
    • GRAMMAR
    • PARAGRAPH STRUCTURE
    • LANGUAGE
    • UNIQUE CONTENT
    • EASY NAVIGATION
    • IMAGES LABELED WITH RELEVANT AND DESCRIPTIVE TEXT IN ALT/DESCRIPTION FIELDS
    • AND PROBABLY MORE
    You mention these items as signs of quality content but many of them are technical issues and don't necessarily equal quality. These are elements that can easily be manipulated.

The same goes for on-page SEO... it's really a blueprint of steps to be followed to get the best results: a) keyword in title, b) keyword in h1, c) keyword used in the text, d) keyword in alt tag, etc...
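And to show just how mechanical that blueprint is, here's a throwaway Python sketch of the checklist (it assumes the BeautifulSoup library is installed, and the items are only the ones listed above, nothing official):

Code:
from bs4 import BeautifulSoup

def on_page_checklist(html, keyword):
    """Tick off the usual on-page 'blueprint' items for one keyword.
    If a script this dumb can check it, a script can also fake it."""
    soup = BeautifulSoup(html, "html.parser")
    kw = keyword.lower()
    title = soup.title.get_text().lower() if soup.title else ""
    h1_text = " ".join(h.get_text().lower() for h in soup.find_all("h1"))
    body_text = soup.get_text().lower()
    alt_text = " ".join((img.get("alt") or "").lower() for img in soup.find_all("img"))
    return {
        "keyword in title": kw in title,
        "keyword in h1": kw in h1_text,
        "keyword in body text": kw in body_text,
        "keyword in image alt text": kw in alt_text,
    }

page = ("<html><head><title>Blue Widgets</title></head><body>"
        "<h1>Blue widgets</h1><img src='w.jpg' alt='photo of blue widgets'>"
        "<p>All about blue widgets.</p></body></html>")
print(on_page_checklist(page, "blue widgets"))

Every item comes back True, and the page could still be spun garbage.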

So really, once you figure out what a search engine is looking at and considers quality or important, it's easy to manipulate. You can still throw out spun crap content, and as long as you are following your blueprint for on-page SEO and the other page elements that make Google happy, you'll have favorable SE results.

Which, in the end, kills the quality content theory: it's really just figuring out what an SE wants to see on a page. Put all of the right pieces in the right places and you'll make an SE bot happy.

    And, this is pretty much what all SEO's do... figure out how to tweak their pages and manipulate content to get SE bot love. What started with repeating "sex" 1,000 times in the footer of your page has now evolved to a new bag of tricks.

    And with all of that said... a couple hundred bulletproof high quality kick ass backlinks will trump just about anything you can do on-page when it comes to SEO and ranking.
    Signature

    I'm all about that bass.

    {{ DiscussionBoard.errors[2132347].message }}
  • Profile picture of the author Kurt
A point being overlooked is that Yahoo did a study showing that user click data returns much better results (as judged by humans) than a ranking algo does.

The claims are that this can easily be cheated (although no one has said specifically how to cheat the Google toolbar); however, this doesn't dismiss the point that the SEs do have ways to value "quality".

These ways may be tampered with, but few are making that argument, and those that do don't discuss the same flaws in what is considered the "accepted" algo, where it's assumed no quality scores are included.

The simple truth is that people will always be able to judge quality better than a math formula, and a combination of both is probably optimal: the formula can screen out a high percentage of junk automatically, then some sort of human factor takes over.

    And human factors are not limited to human reviewers. As I pointed out, there are other ways to determine quality.

    Here's how you can do it: For any given keyword query, the "standard" algo is used to bring up the results.

Of the top 20, about 15 sites have already been tested as "good" by tracking human behavior, as I listed above. The other 5 are included as tests to see how they compete against the 15 pages that are known to be "good".

Based on what the past 1000 people did, you use Bayes' formula for probability to predict what the next 1000 people will do. If 1000 real people indicate they like one page more than another, chances are the next 1000 people will also like that page. The pages that are "well liked" move up and the others move down.
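Here's a back-of-the-envelope Python version of what I mean, nothing more than a smoothed estimate from the last batch of searchers (all the numbers and the simple Beta(1,1) prior are invented for illustration):

Code:
def predicted_preference(satisfied, shown, prior_good=1, prior_bad=1):
    """Bayes-style smoothed estimate of the chance the NEXT searcher
    will be happy with this page, given how the last batch behaved."""
    return (satisfied + prior_good) / (shown + prior_good + prior_bad)

# page -> (searchers who clicked it and did not come straight back, times it was shown)
observed = {
    "known-good page A": (430, 1000),
    "known-good page B": (380, 1000),
    "test page C":       (55, 100),   # one of the 5 test pages competing against the known-good ones
}

reranked = sorted(observed, key=lambda p: predicted_preference(*observed[p]), reverse=True)
print(reranked)   # the well-liked pages move up, the others move down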

    Of course this will have to update as things change. A search for "BP oil" two months ago would require one type of info to be "good" (stock quotes?), while today searchers for "BP oil" are likely looking for something else.

This will work, and this will work well, if cheating can be taken care of... Which, BTW, Google spends about 90% of its SERP resources on spam control and removal, not on "ranking" with the present "algo", so it isn't like abuse doesn't already take place.

The best results may be a mixture... Links, on-page and human elements all have pros and cons, and I don't see any reason to rely solely on any 1 or 2 of them.

    How about if the top 20 returned a couple of pages that had a lot of "link juice"? And a couple of other pages were tight with the On Page relevancy...And a few others did really well with the human factors? Then a few pages did really well with another formula that balanced all factors?

    Not only do I say these results would be better, they would also be just about impossible to reverse engineer.

    Why should all the SERPs be based on the same criteria?

    And just for the sake of discussion, let's assume Google does use some type of "people rank"...The last thing they would do is announce it.
    Signature
    Discover the fastest and easiest ways to create your own valuable products.
    Tons of FREE Public Domain content you can use to make your own content, PLR, digital and POD products.
    {{ DiscussionBoard.errors[2132545].message }}
  • Profile picture of the author JackPowers
    Google can judge quality content by incorporating factors such as bounce rate into the algorithm.

I personally think this is why a site getting lots of backlinks may surge briefly in the SERPs, only to drop out fast if the bounce rate is high.

There are also other factors such as brand recognition: people keying in the name of your site a lot, or maybe even bookmarking the page in Chrome.

How about when a user searches for a keyword and then bounces from site to site until landing at your site and staying there? That would signify that your site has the most relevant content.

So, while I don't think there's any getting around backlinks, Google certainly has other methods than that to judge the quality of content.
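Just as an illustration of how simple the raw numbers are once you have the visit data (everything below is made up), it's something like this:

Code:
# each visit: (landing page, pages viewed in the session, seconds spent on the landing page)
visits = [
    ("site-a.com/post", 1, 4),
    ("site-a.com/post", 1, 7),
    ("site-a.com/post", 3, 95),
    ("site-b.com/guide", 4, 210),
    ("site-b.com/guide", 1, 12),
    ("site-b.com/guide", 5, 320),
]

def behaviour_summary(visits):
    """Per landing page: bounce rate (one-page sessions) and average dwell time."""
    stats = {}
    for page, pages_viewed, seconds in visits:
        total, bounces, dwell = stats.get(page, (0, 0, 0))
        stats[page] = (total + 1, bounces + (pages_viewed == 1), dwell + seconds)
    return {page: {"bounce_rate": round(b / t, 2), "avg_dwell_sec": round(d / t, 1)}
            for page, (t, b, d) in stats.items()}

print(behaviour_summary(visits))

Whether Google actually weighs numbers like these is anyone's guess; the point is only that they are trivial to collect.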
    Signature

    {{ DiscussionBoard.errors[2132608].message }}
    • Profile picture of the author Jacob Martus
      Originally Posted by JackPowers View Post

      Google can judge quality content by incorporating factors such as bounce rate into the algorithm.

      I personally think that this is why a site getting lots of backlinks may surge briefly in the SERPS for then to drop out fast is the bounce rate is high.

      There are also other factors such as brand recognition. If people are keying in the name of your site a lot or maybe even bookmarking the page with Chrome.

      How about when a user searches for a keyword and then bounces from site to site until landing at your site and staying there. That would signify that your site has the most relevant content.

      So, while I don't think there's any getting around backlinks, Google certainly have other methods than that to judge the quality of content.
Bounce rate will never be an important metric. It might have a tiny, tiny, tiny fraction of an impact, but that's it. Otherwise people would just use bots to decrease bounce rate and increase time on site. That eliminates your first and third factors as quality metrics.

The bookmarking or brand recognition I don't know about. Not that many people use Chrome in comparison to Firefox, and people would still be bookmarking their own content through Chrome and probably having their family and friends do it also.

      I'm sure there are ways for them to do better...but not using those metrics.

      {{ DiscussionBoard.errors[2133041].message }}
    • Profile picture of the author Adam Roy
      Originally Posted by JackPowers View Post

      Google can judge quality content by incorporating factors such as bounce rate into the algorithm.

      I personally think that this is why a site getting lots of backlinks may surge briefly in the SERPS for then to drop out fast is the bounce rate is high.

      There are also other factors such as brand recognition. If people are keying in the name of your site a lot or maybe even bookmarking the page with Chrome.

      How about when a user searches for a keyword and then bounces from site to site until landing at your site and staying there. That would signify that your site has the most relevant content.

      So, while I don't think there's any getting around backlinks, Google certainly have other methods than that to judge the quality of content.
I'm not sure why, but I hadn't thought about bounce rate having an effect on the SERPs.

But it makes perfect sense; I don't know why I didn't think of that.

That's a great point, you got me thinking.
      {{ DiscussionBoard.errors[2133417].message }}
  • Profile picture of the author hisnibbs
    "Quality" is also a subjective measurement. You may have really well-produced content, but if it doesn't answer my requirements then I would say it's of no use and therefore zero quality. But a piece of badly written junk with the one fact I was looking for? Bingo!

    Not sure how to measure that, though. Bounce rates maybe, but I suppose Google has to make a judgement call on the content as to whether or not it is "quality", and the factors they use, although subjective, are important to us SEO people. A good way of attempting to get into the mindset of the Google quality judgement is to look at pages that rank higher than you, analyse just what on the page might be considered "quality", and then attempt to replicate or better that (and no, don't just rip off their copy!). Look at layout, the number and quantity of paragraphs, etc. I am looking at all of this at the moment and it's very interesting what I am finding. I'm probably going to put a report together soon about it, as I think I may have found one of the elements to "quality". But I may just keep it to myself for a bit first ;-)
    {{ DiscussionBoard.errors[2340455].message }}
  • Profile picture of the author Steven Heron
    The main way that Google determines good quality content vs bad quality content is inbound links. End of story. What a load of filler.
    {{ DiscussionBoard.errors[2340521].message }}
  • Profile picture of the author Aron Levin
    I'm not sure that there's any actual proof to back this up, but I totally agree.

As long as you're actually providing some sort of value to your readers (= quality), you'll do way better than the auto-created junk that's out there.

With Google Toolbar and Google Analytics it's very easy to track how users are browsing your website. Are they ever coming back to the site? How many seconds are they spending on each page? Do they bounce from the page after landing on it and redefine their search query or click on another result?

Example:
If page a) is in position 1 and page b) is in position 2, and people who visit page a) keep returning to their search results and clicking page b), I'm sure Google tracks that and keeps it in their database to determine that page b) is more relevant, ranking it up to number 1 after a while.
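Mechanically, that kind of re-ranking could be as simple as this little Python sketch (the numbers and threshold are hypothetical, and obviously nothing like whatever Google really does):

Code:
# for one query: how often each result was the page searchers finally settled on
# (clicked it and did not come back to the results page)
current_order = ["page-a", "page-b", "page-c"]        # position 1 first
settled_on = {"page-a": 210, "page-b": 640, "page-c": 150}

def rerank_by_behaviour(current_order, settled_on, min_observations=500):
    """Only reorder once enough searchers have been observed; otherwise
    keep the existing link-based order."""
    if sum(settled_on.values()) < min_observations:
        return current_order
    return sorted(current_order, key=lambda p: settled_on.get(p, 0), reverse=True)

print(rerank_by_behaviour(current_order, settled_on))
# ['page-b', 'page-a', 'page-c'] -- the page people keep settling on climbs to number 1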

    I've been saying it over and over again to the SEO folks out there:

Creating user-friendly websites and valuable content is far more important than stuffing your content with keywords and submitting your articles to directories. If you write great content, or even decent content with valuable information, you'll have people linking to your content, clicking the Facebook Like button and the retweet button (and telling their friends and followers about your stuff), and that's far more valuable than a link in an article directory.
    {{ DiscussionBoard.errors[2340570].message }}
  • Profile picture of the author rahulr
The answer is only one: backlinks, backlinks, backlinks.
    Signature
    {$5 ONLY}750 word article written on your keyword.
    GET IT NOW -http://goo.gl/gXH4m
    {{ DiscussionBoard.errors[2340857].message }}
    • Profile picture of the author Universal_Soul
I too think that chanting about "quality" content is a bit... weird. It must be some kind of rationalization, I believe.

The correct word is: popular. The word probably comes from Google's engineers and some "quality score" algorithms:
- whether it's a duplicate of something that is already there
- how long people will stay there
- and of course how many people will vote for it, which means links

Then we have more subtle elements, like how "valuable" (in terms of authority) the domains that vote for site X are.

So sometimes it might have to do with quality, but very rarely, because most people don't want quality. Most people want crap....
      {{ DiscussionBoard.errors[2479611].message }}
      • Profile picture of the author BishopMartin
I don't know how anyone can argue that Google does not at least try to evaluate the "quality" or value of a page and then use this evaluation to determine, in part at least, how it ranks pages.

PageRank was/is an evaluation of quality/value. More links going to a page means more people must value it or consider it quality. It should be obvious that Google will continue to find more methods to evaluate quality.

I agree with those saying that Google can't determine a page's quality based on sentence structure, grammar, paragraph composition, originality, etc. As stated, there are simply too many examples of gibberish/pig Latin/lorem ipsum ranking well to believe this is currently a major (or even minor) factor. But I don't see why it's hard to believe that they are trying to do this, and that it could eventually become a more important factor. Microsoft Word can assess grammar in documents as well as word/phrase variety and reading level. I'm sure Google is testing similar things to evaluate written content.

Outside of the quality of the writing, there are other, more practical ways Google could evaluate quality.

        - CTR to your site from the SERP
        - Number of Bookmarks
        - Feed Subscribers
        - Time on Site
        - Pages Viewed
        - Bounce Rates (or pogosticking)

Google has stated that they take into account over 200 "signals" to determine where a page stands in their results pages. If Google eventually determines that one of these signals (or a combination of them) is a better indicator of quality than links and on-page SEO, then it seems obvious these "quality" factors would start to supplant some of the other factors.
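Nobody outside Google knows the real signals or their weights, but mechanically a blend like that is nothing more exotic than a weighted sum. A purely hypothetical Python sketch:

Code:
# entirely invented weights -- links and on-page factors dominate today,
# while the behavioural 'quality' signals above hold a small share that could grow
weights = {
    "links": 0.45,
    "on_page_seo": 0.30,
    "serp_ctr": 0.10,
    "time_on_site": 0.07,
    "bookmarks_and_subscribers": 0.05,
    "low_bounce_rate": 0.03,
}

def combined_score(signals, weights):
    """Weighted sum of signal values normalised to a 0-1 scale."""
    return sum(weight * signals.get(name, 0.0) for name, weight in weights.items())

page_signals = {"links": 0.9, "on_page_seo": 0.8, "serp_ctr": 0.4,
                "time_on_site": 0.5, "bookmarks_and_subscribers": 0.2,
                "low_bounce_rate": 0.6}
print(round(combined_score(page_signals, weights), 3))   # 0.748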

        All that having been said, on-page seo and links still rule. If any of the above factors count, they sure don't count for much, yet.

        BMartin
        {{ DiscussionBoard.errors[2482519].message }}
  • Profile picture of the author Steven Carl Kelly
    Yeah, Google knows quality. That's why when you try the highly competitive term

    quick weight loss

    you find this page ranked #1

    Quick Weight Loss Centers - Weight Loss Programs, Products, Diet Pills

    and this one ranked #2

    Fast Weight Loss Tips - How To Lose Weight Fast

    and THIS one on page 10:

    Quick Weight Loss Best for Long-Term Success? - MedicineNet - Health and Medical Information Produced by Doctors

    Quality is clearly the winner there. Or not.
    Signature
    Read this SURPRISING REPORT Before You Buy ANY WSO! Click Here
    FREE REPORT: Split Test Your Landing Pages the Easy Way
    {{ DiscussionBoard.errors[2482582].message }}
    • Profile picture of the author Universal_Soul
      Originally Posted by Steven Carl Kelly View Post

      Yeah, Google knows quality. That's why when you try the highly competitive term

      quick weight loss

      you find this page ranked #1

      Quick Weight Loss Centers - Weight Loss Programs, Products, Diet Pills

      and this one ranked #2

      Fast Weight Loss Tips - How To Lose Weight Fast

      and THIS one on page 10:

      Quick Weight Loss Best for Long-Term Success? - MedicineNet - Health and Medical Information Produced by Doctors

      Quality is clearly the winner there. Or not.
      Heh)

I see the same thing, mate....
I don't know what all these guys talking about quality are on, but it must be good stuff, and I want some of it.

In my niche, number one for a term.... hmmm, let's say "plastic aprons", is a guy who doesn't even sell them!)

I can show you niches where it's not only CRAP occupying the first position, but it's also COMPLETELY IRRELEVANT.

As for quality, I just read a good post by Rand Fishkin on the SEOmoz blog. Everyone is surprised that Google is lowering its standards so much. There is hope among the "good boys" that Mr G will slap the "bad boys" at some point, but it's going way too far for my liking.

What's happening with the rankings is completely irrelevant, manipulated, and with no roots in the real world.

Now everybody has to do the same crap; we're going back to link farming and cloaking, as far as I'm aware.

      JJ
      {{ DiscussionBoard.errors[2484225].message }}
    • Profile picture of the author Universal_Soul
      Originally Posted by Steven Carl Kelly View Post

      I love position number two.
I can tell you right now the best technique for being at no. 1 in Google for weight loss: go to the same link farms and hire Chinese guys for $5/hour, or manufacture this crap yourself.

The problem with this kind of ranking is that these sites, despite the no. 1 ranking, won't get any good conversion rate anyway. And the good sites that could actually SELL something (those believing in "quality content") are buried deep behind the spammers in the rankings.

I'm starting to believe, from reading the topics and advice here, that this is a spammers' forum. And it probably is. The good news is: spam is back big time.


      All the best

      JJ
      {{ DiscussionBoard.errors[2484254].message }}
  • Profile picture of the author Fernando Veloso
Not this discussion again... LMAO. Maybe people should ask Google? Oh wait, Steven did just that! And the winner is ^^^?
    Signature
    People make good money selling to the rich. But the rich got rich selling to the masses.
    {{ DiscussionBoard.errors[2482599].message }}
  • Profile picture of the author Marketing Ignite
You are definitely right about quality content... fresh and high quality is what they want to present. However, if you look at the companies that are number one for top competitive keywords, the majority of them do some sort of intelligent link building...
    Signature

    Digital Marketing Consultant since 1998. Contact me for a free consultation.
    https://www.marketingignite.com

    {{ DiscussionBoard.errors[2484275].message }}
  • Profile picture of the author Universal_Soul
    Originally Posted by petelta View Post

    It's....the quality of the content.

    You have to look at what Google actually is. They are a service that provides their users with answers to their questions. Google wants more then anything to provide their users with the best quality content to their answers.

    You can never fake the quality of the content.

    Travis
    And this is their best answer for a query: weight loss tips

    Fast Weight Loss Tips - How To Lose Weight Fast

You're right, Google is all about quality and you can't fool them. Dream on; just tell me what drugs you're on and I'll go and get some for myself.

    JJ
    {{ DiscussionBoard.errors[2484282].message }}
  • Profile picture of the author Gemelo
Backlinks still matter; Google has just changed the way it looks at them, and it's our job to figure that out again. Ranking websites by link popularity is why Google is the biggest SE in the world.
    Signature

    Great plans to weight loss

    {{ DiscussionBoard.errors[6650333].message }}
  • Profile picture of the author StarrManUK
    way to bump a very old thread...
    {{ DiscussionBoard.errors[6650504].message }}
