Just a thought on Google algorithms

8 replies
This is just a thought I had, maybe it's completely wrong, but I thought I'd throw it out there.

As most people know, the thing Google tries to go for in its search algorithms is quality... It's always trying to reward "good content" and punish spam, and so on and so on.

But I had this idea... if Google could ever really come up with an algorithm that could reliably tell "good content" from "bad," wouldn't that basically be the point at which computers could just write all our content for us?

I mean, couldn't the same equations used in this algorithm just be turned around to write quality articles on any topic?
#algorithms #google #thought
  • David Keith
    I don't think the algorithm will ever actually get to that point. The trend over the last 10 years has been to put less weight on the actual content, and more weight on how actual people use and interact with the content.

    At first, things like meta tags and keyword densities were all the search engines ranked sites by. Now it's evolved to the point where it's primarily a backlink focus, although that's being exploited for now just like meta tags were. The algorithm will continue to evolve and weight how real people feel about the content more and more.

    It's harder to trick a bunch of real people than a computer algorithm looking for a specific set of values.
    • The Copy Warriors
      Originally Posted by David Keith View Post

      I don't think the algorithm will ever actually get to that point. The trend over the last 10 years has been to put less weight on the actual content, and more weight on how actual people use and interact with the content.

      At first, things like meta tags and keyword densities were all the search engines ranked sites by. Now it's evolved to the point where it's primarily a backlink focus, although that's being exploited for now just like meta tags were. The algorithm will continue to evolve and weight how real people feel about the content more and more.

      It's harder to trick a bunch of real people than a computer algorithm looking for a specific set of values.
      Does it factor in how long people stay on the page? I think that would make sense.
  • alexxcans
    You've got the idea, but computers are still limited.
  • DailyHealthBlitz
    I think Google's algorithm will continue to evolve, and at the same time so will the strategies of Internet marketers. I am amazed at what I see programmers come up with to work around Google's rules. It's like a chess match with no winner.
    Signature

    Founder, Owner, CEO, President, Head Honcho, Big Cheese and Chief Janitor of http://www.DailyHealthBlitz.com- the first daily deal site dedicated to health and fitness

  • Vlad Romanov
    I'm confused about what point you're trying to make. Google's algorithm doesn't bump "original content" to the first spot automatically. Unique content is obviously better, but how does knowing that fact allow a computer to write unique content?

    I mean, I'm completely confused. Are you saying that because Google likes original content, your computer will be able to write an article about visiting a resort in the Bahamas?

    Anyhow... good luck with your thought,
    -Vlad
  • Bryan V
    What's 'good' is just what people want, and that may change over time too. So user signals such as backlinks, social metrics, user history, and things like the 'freshness' update all try to put human input into the algorithm.

    Sometimes the thing people want is something they already know, or expect to be out there...like recipes, photos, videos, guides, the newest column from your favorite writer, etc.

    I suppose for some searches, content could be auto-generated and even given personalities too..... but that's kind of out of the realm of what they're doing here.
    Signature
    Perhaps an attic I shall seek.
  • caseycase
    Originally Posted by Ken_Caudill View Post

    There are style checkers, spell checkers and parsers capable of determining reading level. I wouldn't be surprised if those things evolved into an algorithm with the ability to determine a loose approximation of quality --or at least a readability factor.

    I don't know if we'll ever get a program with the ability to measure syntax.

    My guess is that the time spent on a site and page views might carry more weight than most people think. It seems to me they would be pretty good indicators of reader involvement and general quality.

    Copy written for machines makes me nauseous. I believe most people react that way. Thus Panda.
    Agreed. I think the biggest thing keeping them from applying a readability factor (assuming they don't apply one right now) is the computing power necessary. That could change in the future, but it may not even be necessary, because time on site and page views might be enough.
    Signature

    Free IM Info, No Junk - http://www.ironcladim.com



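For what it's worth, a crude "readability factor" like the one discussed above is already cheap to compute. Here's a minimal sketch using the classic Flesch reading-ease formula; the syllable counter is a rough vowel-group heuristic rather than a real parser, and this is just an illustration of the idea, not anything Google is known to use:

```python
import re

def count_syllables(word):
    # Rough heuristic: count groups of consecutive vowels; not linguistically exact.
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and count > 1:
        count -= 1  # drop a likely silent trailing 'e'
    return max(count, 1)

def flesch_reading_ease(text):
    # Flesch reading ease:
    # 206.835 - 1.015 * (words/sentences) - 84.6 * (syllables/words)
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z]+", text)
    if not sentences or not words:
        return 0.0
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / len(sentences))
            - 84.6 * (syllables / len(words)))

# Higher scores mean easier reading; short, simple sentences score high.
print(flesch_reading_ease("The cat sat on the mat. It was happy."))
```

A real system would need far better sentence and syllable detection, but even this toy version shows how a "readability" number could be attached to a page automatically.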