Why backlinking will never die

8 replies
  • SEO
Hello fellow Warriors, I originally posted this in the thread "Backlinks are going to die", but felt it deserved its own. Lately I've been seeing a lot of people deluding themselves that Google will someday be able to judge the quality of a site from its content alone, and so eventually do away with backlinks, so I thought I'd put in my 2c.. well, it started out as 2c but quickly turned into about $1.75

So, here's the big question: what exactly is quality content? Beyond relevance to the subject, correct grammar, neat formatting, not being obvious autospun garbage, etc etc, how will Google ever be able to judge a quality site or piece of content?

When you visit a site and read an article about a certain subject, 9 times out of 10 (unless it's some specific how-to, like how to change an alternator), the article is basically just a large opinion written in paragraph form. And an opinion isn't something an algorithm can grade as 'this is quality' or 'this is stupid', or 'we'll rank you higher than the next site because my binary sensors agree with you'. An opinion can only be formulated, and agreed or disagreed with, by the human mind.

For sites like entertainment or e-commerce sites, it's still people's opinion that determines whether it's the best store to buy from, or the best site for the celebrity gossip. A computer algorithm will never be able to determine the quality and popularity of a site by its content alone. It needs people to tell it that, and the only way for people to tell it that is by:

a - the number of links,
b - the relevance of the sites those links come from, and
c - the number of visits to the site via those links.
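To make the idea concrete, those three signals could be combined into a toy score like the one below. The weights and the `Backlink` fields are entirely made up for illustration; Google's real formula is unknown and far more complex.

```python
from dataclasses import dataclass

@dataclass
class Backlink:
    relevance: float   # 0.0-1.0, topical relevance of the linking page
    clicks: int        # visits referred through this link

def link_score(backlinks: list[Backlink]) -> float:
    count = len(backlinks)                           # a) how many links
    relevance = sum(b.relevance for b in backlinks)  # b) where they come from
    traffic = sum(b.clicks for b in backlinks)       # c) visits via those links
    # Hypothetical weights, chosen only to show the three signals combining.
    return 1.0 * count + 2.0 * relevance + 0.1 * traffic

links = [Backlink(relevance=0.9, clicks=40), Backlink(relevance=0.2, clicks=3)]
print(link_score(links))  # 2 links + 2.2 relevance + 4.3 traffic = 8.5
```

The point of the sketch is only that all three inputs come from people's behavior, not from the page's own text.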

Of course, they need to keep their system running smoothly, constantly tweaking their algorithm to weed out the junk.. in essence, gigantic 'filters' to keep their system 'clean'. But this post isn't about how they count the links, or how they decide whether a link is valuable and by how much.. it's about the future existence of links, of people's popular 'votes'.

Even in the case of the factual how-to site, it's still people's votes that say "This is the best source on how to change your alternator in a 1998 Cadillac Seville". Two different how-tos could give basically the same directions, but the algorithm can't tell whether it's a bright idea to remove the head gasket first. Only people (and their votes/links) can tell that.

That's why they need to use links as the base ranking factor of their whole system. Links are human 'votes' agreeing or disagreeing with the usefulness of a site. Only humans can judge that, and that is what makes a page rank higher. The higher it is, the easier it is to find, the more people say "Google it" instead of "Yahoo it" or "Bing it" (because Google gives you the right answer every time), and the more Google and its shareholders profit.
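The "links as votes" idea is in fact how PageRank, Google's original ranking algorithm, was publicly described: each page splits its vote among the pages it links to. A minimal power-iteration sketch over an invented three-page web (the graph, damping factor, and iteration count are just illustrative defaults):

```python
def pagerank(graph: dict[str, list[str]], damping: float = 0.85,
             iters: int = 50) -> dict[str, float]:
    """Toy PageRank: every outbound link is a 'vote' split among its targets."""
    pages = list(graph)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in graph.items():
            if not outlinks:  # dangling page: spread its vote evenly
                for p in pages:
                    new[p] += damping * rank[page] / n
            else:
                for target in outlinks:
                    new[target] += damping * rank[page] / len(outlinks)
        rank = new
    return rank

# B collects votes from both A and C, so it ends up ranked highest.
toy_web = {"A": ["B"], "B": ["C"], "C": ["A", "B"]}
ranks = pagerank(toy_web)
```

Notice the algorithm never looks at any page's content; the ranking falls out of the link votes alone, which is exactly the OP's point.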

Just like any successful business, Google and its system need interactivity with people.. it can't just write a clever algorithm and let it churn away on its own, letting the computer decide what is quality and what is not.. it needs people to help its system determine quality and relevance. That can only be gauged by links.

Even social signals can't do it without links. You don't see anyone tweeting or facebooking "Hey, there's this awesome website I found, if you google 'car stereos', go to page 46, 6th link down, click that and that's the page".. I suppose you could, but really you just post the link. That link is the signal, and if that link gets posted in enough places by enough people, that site obviously goes higher.

Even if Google's algorithm evolves into some sort of artificial intelligence in the year 3629, it would then be ranking sites based on Google's 'opinion' of whether a site is relevant or not, and that would be only one of a million other 'opinions' as to whether a site is the most helpful and popular with its users.

In conclusion to my post, which has now become a thesis: Google spends an enormous amount of time, money, and resources on determining whether links are in fact coming from real users. In the end, they want to deliver the most relevant results, for the people, by the people. That only proves the point that linking itself isn't going away.

Google's entire business model is based not necessarily on fact, but on popularity. People will always be its most significant determining factor, and the links they post are the headcount. The more people agree with the content on your site, the more 'it' figures your site is the most relevant and helpful, thus ranking it higher.. its 'content' isn't better, per se, but it is more 'popular'. Just like politics, the top spot goes to the popular vote.. the links are the ballots the people cast.
#backlinking #die
  • Blakos
    The short answer would be:

    "Because that is how spiders work, they use links to crawl the web"

    It is then that Google's algorithms come into play.
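    That link-following is easy to sketch: a spider is essentially a breadth-first traversal that can only discover pages through links. Below is a minimal sketch over an in-memory toy "web" (no real HTTP; the page graph is invented):

```python
from collections import deque

# Toy "web": page -> pages it links to (stand-in for real HTML over HTTP).
toy_web = {
    "home": ["about", "blog"],
    "about": ["home"],
    "blog": ["home", "post-1"],
    "post-1": ["blog"],
}

def crawl(start: str, web: dict[str, list[str]]) -> list[str]:
    """Breadth-first crawl: pages are discovered only by following links."""
    seen = {start}
    queue = deque([start])
    order = []
    while queue:
        page = queue.popleft()
        order.append(page)
        for link in web.get(page, []):
            if link not in seen:  # don't fetch the same page twice
                seen.add(link)
                queue.append(link)
    return order

print(crawl("home", toy_web))  # ['home', 'about', 'blog', 'post-1']
```

    A page nothing links to simply never enters the queue, which is why a site with zero backlinks is invisible to a pure link-following spider.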
  • Joshua Lowenthal
    We have to remember, though, that Google uses humans as well to browse the web and determine various factors about websites and their general content.

    Spiders may be able to see that you have a lot of links coming into your website, but ultimately if a human rater gives your website a bad report, chances are Google will rank you lower than other websites in the search rankings.

    Google Manual Reveals Humans Who Manually Rate Results

    In today's SEO world, only a lot of unique, well-written, well-organized, easily browsable content is going to give you the upper hand.

    It doesn't matter how many people you lead to a well for a drink; if it's dried up, then it's dried up. Eventually someone at G will notice and stop helping you send people to a dry well.

    The point is, if you have a nice cool freshwater well and people are thirsty, they will let everyone know where to go to get some water. THIS is what back-linking is supposed to be like. If you are building your own back-links, then they are NOT truly genuine, because YOU made them. It's about what everyone else thinks, not you.
    • Blakos
      Originally Posted by Joshua Lowenthal

      We have to remember, though, that Google uses humans as well to browse the web and determine various factors about websites and their general content.

      Spiders may be able to see that you have a lot of links coming into your website, but ultimately if a human rater gives your website a bad report, chances are Google will rank you lower than other websites in the search rankings.

      Google Manual Reveals Humans Who Manually Rate Results
      I wouldn't be surprised if they used the statistical information gathered to help develop their algorithms.
      • Joshua Lowenthal
        I can almost say without a doubt they use that intelligence.

        It can get tricky, though, if you think about it, considering humans have bias and ego, whereas technology can only do what it is told.

        A fine balance is what Google needs to find.

        I personally have a much deeper view of how Google operates. I talked about it on a thread I made here, but I'm not going to link people away from OPs thread.

        I think there is a method to the madness at Google, and no one can figure out what or why, because Google is essentially THE tech giant.

        So they do what they want when they want, and it isn't to make our lives any better, that's for sure.
    • jinx1221
      Originally Posted by Joshua Lowenthal

      The point is if you have a nice cool freshwater well and people are thirsty, they will let everyone know where to go to get some water. THIS is what back-linking is supposed to be like. If you are building your own back-links then they are NOT truly genuine, because YOU made them. It's about what everyone else thinks, not you.
      I both agree and disagree on this one. If the 200 pages before yours are misinformed, and yours is the right answer, how are people going to find your site without you self-promoting it in some way to get the ball rolling? SEO is 'search engine optimization'. All forms of SEO, from link building to putting keywords in descriptions and page titles, are attempts to get your site noticed, and ranked higher than the rest. Who's to say which forms of SEO are right or wrong?

      That's the problem with the paranoia mindset that all these Penguin updates are causing. People fear doing anything to promote or link to their own site, for fear of a penalty or deindexing or Google mailing anthrax to their home, to the point that they believe "Content" and "Content Alone" will rank their site. It won't. Build it and, sorry to say, they might come, but just as surely, they won't know the directions to get there unless you show them the way.
      Signature
      The Ultimate Private Network Management, Visualization and Automation Tool
      • andrewkar
        It will track our behavior and the flow of traffic in real time. Forget about static links and stuff like that...

        Right now Google can differentiate between genuinely authoritative sites and "wannabe" authority sites...

        Google Analytics provides real-time data already...

        So, it's going to be huge fun
        Signature
        Do what you want to do!
        • jinx1221
          Originally Posted by andrewkar

          Right now Google can differentiate between genuinely authoritative sites and "wannabe" authority sites...
          How's that, besides possibly domain age? How does the system know what is "real" and what is "wannabe"? Because one might have better CSS or site design?
  • jinx1221
    Good points. It's not that well-written, compelling content isn't what it takes to rank well - people naturally link to good sites - it's that the system alone can't tell what is quality and what isn't. That's where they need the system and humans (and links) to work hand in hand.

    Put it this way, with the how-to automotive site, for example:

    Say 'Site A' has 600 pages of unique, well-written content. 'Site B' has 2 pages, no pictures, thin, step by step, there's your answer.

    The algorithm naturally says 'Site A' is higher quality and better, so it should rank much higher than 'Site B'.

    Now someone enters into Google: "How to change your flat tire"

    'Site B' says:

    1) Take tire iron
    2) Unbolt tire from car
    3) Take spare tire out of trunk
    4) Bolt spare tire on car
    5) Put flat tire in trunk

    Ooookayyyy.. BUT... 'Site A' (the 'quality' site) says:

    "Today we are going to teach you, step by step, by trained technicians, how to change your flat tire. First, go to the kitchen and grab a butcher knife. Make sure it is stainless steel, as to not scratch the paint off your wheel wells. Next, carefully cut the rubber from around the rims. Make sure you have the car jacked up on cardboard boxes.. this will allow the air to continue flowing underneath your car, which is very important and approved by the United Car Owners association of America." etc etc

    Page after page of, well, basically B.S.

    The algorithm alone has no way to tell which of the two is quality and which is not. Human logic would obviously say 'Site B' is higher quality, whereas the computer would say 'Site A' is.

    They need human indicators (in the form of links) to say that since nobody is linking to 'Site A' but people are linking to 'Site B', 'Site B', though thin and lower "quality", is the one we want to show in our index. In fact, showing 'Site A' in our index could actually hurt our business, because people would not find what they want to know, and would go next door to find it.
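    The Site A / Site B flip can be played out in a few lines. With a naive content-only proxy (raw word count here, a deliberately crude stand-in for any text-based "quality" measure), Site A wins; once human link votes dominate the score, the order flips. All numbers and weights are invented for illustration:

```python
# Hypothetical pages: Site A is long but wrong, Site B is thin but right.
sites = {
    "Site A": {"words": 600 * 500, "inbound_links": 2},
    "Site B": {"words": 120, "inbound_links": 340},
}

def content_only(site: dict) -> float:
    # Naive "quality" proxy: more well-formatted words = better.
    return site["words"]

def with_link_votes(site: dict) -> float:
    # Same proxy, but the human link "votes" carry most of the weight.
    return site["words"] * 0.001 + site["inbound_links"] * 10

best_by_content = max(sites, key=lambda s: content_only(sites[s]))
best_by_votes = max(sites, key=lambda s: with_link_votes(sites[s]))
print(best_by_content, best_by_votes)  # Site A Site B
```

    Any content-only metric an algorithm can compute is some such proxy, and the example above is exactly the failure mode the thread describes: only the link votes carry the human judgment that Site B's directions actually work.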
