What Might "Google Pelican" Bring To The Table?

9 replies
  • SEO
FIRST: Apologies for the linkbait title. For linkbait it is. What follows is IMO only. If you're happy with that, read on. If not, I'm sorry for getting your engine revved. Might be worth taking a look anyway, though...

Sorry for the cheeky title. "Google Pelican"? What's that?

Could it be a new named update from Google? Nope, sorry (or "phew, what a relief", more like): no new algorithm has been released, at least not at the time of writing.

This is supposition, and "Pelican" seems to fit the theme of animals beginning with the letter "P", so that's what I've gone with.

Another Shocking Update? Only If You're An Ostrich!

I'm a cynic at heart; the only thing that really surprises me about Google's updates is the order in which they come. There are a lot of problems in the index: you can see them, and they can see them. It's just a matter of which ones they want to address first. Perhaps a couple of genuine surprises are how late they are to react to many of them, and how ultimately ineffective the fixes tend to be in the fullness of time.

Google are planning some big changes. Are you ready? For those new to SEO, it might be understandable to think that Penguin and Panda were the be-all and end-all of algorithm updates. But that isn't the case: there have been dozens, and most of them have been named and discussed over the years. Check out (for example) "Fritz", "Dewey" and "Caffeine".

Nobody talks about these much as we approach 2014, which is odd, as they are still an integral part of the indexing and ranking process. To ignore them would mean, at worst, risking your site's position in the search engine rankings and, at best, missing an opportunity to improve a little on your current position.

That there will be another update soon, and that the update will be given a name of some sort, is a given. It will definitely happen, probably sooner rather than later.

Sticking With The Zeitgeist

Concentrating on Panda and Penguin for now, let's see what their goals were:

  • Panda set out to analyse on-page content more closely, looking for linguistic anomalies and plagiarism, and then penalised sites that had previously benefited from broadcasting or using sub-prime content.
  • Penguin did a similar job with links, looking at their distribution and surrounding content: all aspects of links, including the anchor text (natch), the platform, the chronology of the link building, the surrounding text and the context in which they were used, even down to how they might be repeated or reposted on social media.

When I read about "shocking updates" and "Google cleaning house at last", it's a face-palm moment for me.

Should we be shocked, or even pretend to be, when Google produce yet another update claiming to analyse content more thoroughly?

Is it news to anyone who has been in the business more than a year or two that they tweak their algorithms to check the provenance of links with a view to penalising excessive self-promotion?

It's not a shock to me. A mild surprise on the day a new update is named and released, maybe, but that's about it. They never announce the actual release date and are vague about specifics, but the overarching purpose of these updates is as old as the hills, and the fact that they are released should "shock" no one.

If either of these ideas is a surprise to anybody in SEO, they're in the wrong business. There is nothing shocking, new or even particularly innovative about either Penguin or Panda. They are part of an iterative process that has been ongoing for at least 13 years, and it will continue.

But what about the next update? We can guess it will again bear down on spammy content or, alternatively, punish the most egregious of self-promotional link builders, but what exact methods might Google employ? What levers will they pull? Which specific metrics will be analysed?

Is there anything we can guess beforehand? Might it be possible to make a good guess now at what's coming, and take the time to reposition our sites and online presence to avoid being penalized?

Here, I've looked into my crystal ball and come up with five specific ideas that I'm sure the engineers at Google are at least looking at, if not actively countering. Remember, these are just my guesses. Fingers crossed that I don't make a complete fool of myself. Hopefully I'll at least get one or two of them right.

Analysis Of Sites With An Excessive Number Of Dofollow Links

I'm putting this one first as, to me, it's almost a given. Until very recently, dofollow links offered by far the best search engine ranking potential of anything in offsite SEO. To this day, private networks offer them by the thousand, rented by the month. Services and spammers seek them out and use them in preference to links tagged with rel="nofollow".

But there's a problem. Over the past two years the entire Internet has been moving away from dofollow. Open source platforms and closed bespoke websites are being redesigned, updated or just edited to include the nofollow tag on all outbound links at an increasingly rapid rate. Now, over 80% of all links that Google analyse have the nofollow tag attached.

So what are they to think of the site where this Pareto is reversed? A site with over 80% dofollow links, and in some cases hundreds if not thousands of them?

A situation like that is exceedingly unlikely to have occurred naturally. It is a huge red flag of self-promotion. By putting the hammer down on commercial sites with a high percentage of dofollow links, Google would go a long way towards destroying the paid link networks that they so vigorously target, as well as wiping out the most excessive of the solo link spammers into the bargain. A real win-win situation for them, and one that would be simple to implement. Expect it!
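To make that concrete, here's a minimal sketch of the kind of check I'm imagining. To be clear, this is pure conjecture: the function names, the assumed 20% "natural" dofollow baseline and the flag threshold are my own inventions for illustration, not anything Google has published.

```python
# Pure conjecture: a toy filter for backlink profiles whose dofollow share
# reverses the assumed web-wide Pareto. Names and thresholds are
# illustrative guesses, not Google's.

def dofollow_share(dofollow: int, nofollow: int) -> float:
    """Fraction of a site's analysed backlinks that are dofollow."""
    total = dofollow + nofollow
    return dofollow / total if total else 0.0

def looks_self_promotional(dofollow: int, nofollow: int,
                           baseline: float = 0.20) -> bool:
    """Flag a profile whose dofollow share reverses the assumed Pareto."""
    return dofollow_share(dofollow, nofollow) > 1.0 - baseline  # i.e. > 80%

# A commercial site with 900 dofollow and 100 nofollow links (90% dofollow):
print(looks_self_promotional(900, 100))  # True -> red flag in this toy model
```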

A Reduction In The Value Of Facebook Signals

The last 18 months have seen an increase in the value of social noise, in particular signals from Facebook. While the social media network does offer a decent level of proof that a website's community is active and contributory, it also raises a major commercial problem for Google themselves.

Facebook is Google's main CPC ad competitor.

Google survive on advertising revenue, primarily via "pay per click", and Facebook are now their major rival. To value the social signals from FB so highly is, via the back door, to actively promote their major commercial rival.

In other words, it's costing them dollars; over time, probably tens of millions of them.

This likely won't be a major slap, but I imagine that overlarge benefits from Facebook social signals will be levelled in a similar way to how exact-match domains (EMDs) were last year.

Domain Age Benefit Will Return

The last six months have seen brand-new sites, even ones with poor content and overtly self-promotional linking backgrounds, sneaking their way onto page one for some very commercial terms, often knocking valuable resource websites down in the process. It's long been thought that domain age and activity play an important part in establishing trust, but over the last six months domain age and established good practice seem to have actually counted against many sites. This has likely been an error or an overcompensation for other factors on Google's part, and one they are certainly aware of.

Age itself is actually a macro factor of three other elements:

  • The amount of content the site has (the more the better).
  • The activity of the site over time (updates are good, establishing a pattern of consistent updates takes time).
  • Finally the trust and authority the site has, which both take time to accrue.

None of these three elements seems to matter at all for these new sites dominating mid to long term commercial terms since the spring of 2013.
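To show how "age" might be reconstructed from those three elements, here's a toy scoring sketch. It's entirely my own invention: the weights and the log scaling are arbitrary, and nothing here is a claimed Google formula.

```python
# Toy illustration only: "age" rebuilt from the three elements above.
# The weights and scaling are my arbitrary inventions, not Google's.
from math import log1p

def maturity_score(pages: int, months_active: int,
                   updates_per_month: float, trust: float) -> float:
    """Content volume + sustained activity + accrued trust (0..1)."""
    content = log1p(pages)                    # more content, diminishing returns
    activity = min(updates_per_month, 10.0) * log1p(months_active)
    return content + activity + 10.0 * trust

# A two-month-old site can't fake a history, however hard it promotes itself:
print(maturity_score(pages=12, months_active=2, updates_per_month=8, trust=0.1))
print(maturity_score(pages=800, months_active=36, updates_per_month=6, trust=0.7))
```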

Matt Cutts has actually denied that age plays a major role, then given a list of factors that are important, none of which could be achieved without the passage of considerable time. Time which, he then says, is measured. Talk about answering a riddle with a question.

It's not difficult to see how a situation like this could occur. Going back to the updates I mentioned at the top of the post, one in particular, "Caffeine", was designed to scour the web more quickly, find new content and index it much faster than had previously been the case, getting it indexed and sometimes ranked well in days rather than weeks. The idea was to value activity over stagnation.

Balanced against this is the important metric of age and trust. The tipping point has wavered first one way and then the other over time. Currently it over-balances towards new sites, giving them high prominence regardless of their on-site and off-site SEO.

In the past, new URLs, pages or posts on established sites were the way forward. Now brand-new sites are winning, often offering little or no value in the process.

The old method, "new URLs on trusted sites", is the one Matt Cutts has previously described as the preferred approach.

Establish your site's credentials and then remain active.
Somewhere in the last six months, the first part of that maxim has been lost in many niches.

Remaining active also meant that the transfer of PageRank through the newer URLs of a well-linked site would be a natural process. Everyone was a winner (if they played the game right).

However, I suspect that it is PageRank, and the repositioning of it as it reaches the end of its patented life, that may well be the cause of this current shift. Expect it to be corrected.

While we're on the subject...
New Node-Based Trust And Authority Measure

The PageRank patent has less than two years to run. In June 2015 it expires and will become public domain. Stanford University own the patent, and Google use it under licence (contrary to what many believe, Google do not own and did not invent PageRank).

It's unlikely that the search engine giant will want to use an open source method for authority and trust distribution. This puts them in a tricky situation.

Developing their own in-house system that is very similar to PageRank could lead to a slew of legal cases challenging their right to use such a similar process once others adopt it; the "look and feel" would need to be sufficiently different and unique to stand a chance of holding up in law. They might end up with nothing like the exclusivity that they are likely to want.

But on the other hand, inventing an entirely new method for the attribution of authority and trust could mean a sea change in the SERPs bigger than anything any individual algorithm update has produced before.
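For reference, the method whose exclusivity is at stake is easy to state. Below is a minimal, textbook power-iteration sketch of PageRank: the published algorithm the patent covers, not whatever tuned variant Google actually run today.

```python
# Minimal textbook PageRank (power iteration). This is the published
# algorithm the patent covers, not Google's production variant.

def pagerank(links: dict, damping: float = 0.85, iters: int = 50) -> dict:
    """links maps each page to the list of pages it links out to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new[target] += share
            else:  # dangling page: spread its rank evenly
                for p in pages:
                    new[p] += damping * rank[page] / n
        rank = new
    return rank

# Toy graph: C is linked to by both A and B, so C accumulates the authority.
print(pagerank({"A": ["C"], "B": ["C"], "C": ["A"]}))
```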

Whatever method is used, expect the next few updates to implement changes that will pave the way for the new system. Some may not even make sense in the short term. We may only have that "aha" moment once the entire system is revealed and makes sense of the situation.

Efforts To Re-Launch Google Plus As A More Personal Platform For Interaction

Google have a long-standing problem, and they know it. They do not engage their target audience in a loyalty-commanding or emotional manner. Unlike on Twitter or Facebook, very few friendships are made or maintained on Google's social platform, "+1".

People use the search engine out of habit, probably having clicked into the tools menu of their browser, selected it as the default opening page, and then promptly forgotten about it. It's a service that finds sites, no more, no less. Or at least it was...

These days a user is likely to type in a search query and be presented with results surrounded by an increasing number of advertisements along the top and right-hand side. Google's footprint for self-promotion is increasing all the time. For many terms there is not one natural search result on page one.

Ask yourself this:

"Do I want to use a service that just presents me with its own advertisements? At what point does it stop being a search engine and become "junk mail".. Not only junk mail, but junk mail that I'm (for some bizarre reason) actively opting to have shoved in my face when there are better, free alternatives available?
I can't pretend to speak for everybody, but if Google disappeared tomorrow, it wouldn't affect on a personal level me one iota. I don't have any emotional involvement with them and I don't know anybody who does. Even Hollywood failed to inspire people to "care". If the competitors such as Yahoo or Bing offered better search results, and didn't drown their users under self-promotion, people will switch to those engines. It's only habit and not quality of service that keeps people with Google.

It costs nothing to switch, and it takes seconds to do. And if you have no emotional involvement in a product, why would you care? Why would anybody care? Why would you continue to use a substandard product when a better one is available for free, one you can switch to in ten seconds?

The answer they are likely to come up with is to promote their own "+1" to give it more relevance: an attempt to get people to really engage with them on a social level and to want to stay with Google for some reason besides "habit". Because just relying on habit, while your competitors have better products and engage in a more emotional way (and swapping costs nothing), is a very dangerous place for them to be.

OK, those are my five predictions. And the name "Google Pelican" (hxxp://www.demondemon.com/2013/09/09/introducing-the-google-pelican-algorithm/)? Wild guess. Might pop down the local betting shop to see if I can get decent odds on it, though.

Scritty
  • nik0
    What stops Google from renewing/extending the license with that university? Money won't be the problem.

    How did you come up with the 80% number for nofollow links?

    Just some numbers:

    Yahoo.com
    dofollow 41,844,051
    nofollow 18,594,020

    Amazon.com
    dofollow 753,581,097
    nofollow 220,424,390

    Warriorforum.com
    dofollow 485,742
    nofollow 299,040

    That's more in the range of 60-70% dofollow, so it seems to me like you're just taking wild guesses. Even if you pulled the number from somewhere, it still has no value, as you can clearly see that all the authority sites have a much larger number of dofollow links than nofollow links.
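    A quick script to check the shares implied by those figures (just arithmetic on the numbers quoted above):

```python
# Dofollow share implied by the figures quoted above.
samples = {
    "Yahoo.com":        (41_844_051, 18_594_020),
    "Amazon.com":       (753_581_097, 220_424_390),
    "Warriorforum.com": (485_742, 299_040),
}
for site, (dofollow, nofollow) in samples.items():
    print(f"{site}: {dofollow / (dofollow + nofollow):.0%} dofollow")
# Yahoo.com: 69% dofollow
# Amazon.com: 77% dofollow
# Warriorforum.com: 62% dofollow
```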

    Seems to me that if you're a typical IMer blasting links with something like UltimateDemon, as you're promoting in your sig, you end up with 80% nofollow links, which means you're just a spammer. Now that's leaving a footprint, so I would rather stick with the majority being dofollow, to simulate the backlink profile of the authority sites.

    If you disagree with me, then please show me 3 authority sites where the majority of links are nofollow. There are billions of websites out there; sure, the majority of authority sites might use the nofollow tag nowadays, but there are not that many authorities per niche.

    To top it off:

    Facebook
    dofollow 9,252,927,099
    nofollow 1,388,447,921

    Quite a sample, right?

    And here's a local site ranking at #2 for "lawyer Atlanta":

    hornsbylaw.com
    dofollow 114,164
    nofollow 971

    No idea how Google could flag 80% nofollow links as natural.

    Or is this part of your plan to promote UltimateDemon, a piece of software that creates 80% nofollow links, now or perhaps in the very near future? You call yourself an innovator, while promoting software that hardly anyone dares to use these days for fear of penalties due to spammy links. Way to go, man!
    • Scritty
      Originally Posted by nik0

      What stops Google from renewing/extending the license with that university? Money won't be the problem.

      How did you come up with the 80% number for nofollow links?
      It's not the license that is the issue, it's the patent exclusivity. In 2015 PageRank will be open source.
      My thinking is that MOZ has repositioned itself (from SEOMOZ to just MOZ) and developed a very strong and increasingly accepted trust measure in DA/PA, and that in the near future Google will absorb MOZ, or at least those metrics.

      The 80% is Google's own figure from their latest investor call (actually they said "close to 80%").
      On that, at what point does "follow" become entirely redundant, so rare and hard to get that entire niches have no websites with them as either inbound or outbound links? What's the point of measuring it then? Wouldn't they knowingly just be measuring self-promotion or blind luck?
      Do they want to reward self-promotion or luck with increased SERPs?

      Those sources for follow links you quote are not particularly representative. Yahoo does not use the tag at all (and actively discourages it). Some of the biggest footprints on the internet are entirely nofollow.
      Wordpress.com is 100% nofollow - there are 70 million of those sites and 13 billion URLs (as are an increasing number of self-hosted WP sites).

      70 MILLION WordPress-hosted sites

      Since 1.5, all wordpress.org (self-hosted) visitor-created links have been nofollow by default, and the most popular SEO plugins for this and other CMS/blogging platforms have nofollow changes coded into them by default.
      In the past 2 years we've seen
      Squidoo
      Ezine
      Most self-hosted wiki and article platforms

      all roll out updates which add the nofollow tag by default, and they are just the tip of the iceberg.

      Check these and other stats for yourself.
      Great input. No point writing anything if it's not going to be checked and questioned.
      Thanks given

      Scritty
      • nik0
        Originally Posted by Scritty

        It's not the license that is the issue, it's the patent exclusivity. In 2015 PageRank will be open source.
        My thinking is that MOZ has repositioned itself (from SEOMOZ to just MOZ) and developed a very strong and increasingly accepted trust measure in DA/PA, and that in the near future Google will absorb MOZ, or at least those metrics.

        The 80% is Google's own figure from their latest investor call (actually they said "close to 80%").
        On that, at what point does "follow" become entirely redundant, so rare and hard to get that entire niches have no websites with them as either inbound or outbound links? What's the point of measuring it then? Wouldn't they knowingly just be measuring self-promotion or blind luck?
        Do they want to reward self-promotion or luck with increased SERPs?

        Those sources for follow links you quote are not particularly representative. Yahoo does not use the tag at all (and actively discourages it). Some of the biggest footprints on the internet are entirely nofollow.
        Wordpress.com is 100% nofollow - there are 70 million of those sites and 13 billion URLs (as are an increasing number of self-hosted WP sites).

        70 MILLION WordPress-hosted sites

        Since 1.5, all wordpress.org (self-hosted) visitor-created links have been nofollow by default, and the most popular SEO plugins for this and other CMS/blogging platforms have nofollow changes coded into them by default.
        In the past 2 years we've seen
        Squidoo
        Ezine
        Most self-hosted wiki and article platforms

        all roll out updates which add the nofollow tag by default, and they are just the tip of the iceberg.

        Check these and other stats for yourself.
        Great input. No point writing anything if it's not going to be checked and questioned.
        Thanks given

        Scritty
        You can say what you want but the hard data shows that the number is as flawed as it can be.

        Take a look at Facebook again: a platform that should attract plenty of links from WordPress blogs, right, as it's mostly normal people that link to it, and still it has the highest % of dofollow links!

        What does Yahoo have to do with using the tag or not? These are the backlinks that point at Yahoo, Facebook and so on. Facebook has 11 billion backlinks; not representative?
      • nik0
        Originally Posted by Scritty

        Since 1.5, all wordpress.org (self-hosted) visitor-created links have been nofollow by default, and the most popular SEO plugins for this and other CMS/blogging platforms have nofollow changes coded into them by default.
        Friend, only the blog comment section has been nofollowed by default, and blog comments are about the worst links you can get.

        I never ran into an SEO plugin that by default turned my contextual links into nofollow; do you have some examples of that, perhaps?

        OK, I read that you're talking about visitor-created links, but who cares about those? They make up a tiny % of links. Unless you allow all spam comments to be posted, but then I'd like to remind you that Akismet is also installed by default.

        Why do I bring this up?

        Because you make the mistake of comparing nofollow blog comments to contextual links on private network sites.

        To grab Facebook as an example again:

        4 million referring domains
        11 billion backlinks

        70 million self hosted wordpress sites existing on the web

        9 billion dofollow
        1 billion nofollow

        If WP had the capability to influence the numbers this hard, it should be reflected in the 90/10 dofollow/nofollow ratio of Facebook, right? I mean, you could assume that with only 4 million referring domains and 11 billion backlinks (2000 times more) a huge portion would come from WordPress, perhaps 500 million links?
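        Running that arithmetic (using the round numbers above; the 500 million is the hypothetical figure from the previous paragraph, not measured data):

```python
# Arithmetic on the figures above; the 500M WordPress figure is the
# hypothetical from the paragraph above, not measured data.
dofollow, nofollow = 9_252_927_099, 1_388_447_921
total = dofollow + nofollow
print(f"Facebook today: {dofollow / total:.0%} dofollow")  # ~87%

assumed_wp_nofollow = 500_000_000  # hypothetical extra nofollow links from WP
print(f"With them added: {dofollow / (total + assumed_wp_nofollow):.0%} dofollow")  # still ~83%
```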

        Anyway, enough said, back to work!
        • Scritty
          Originally Posted by nik0

          Friend, only the blog comment section has been nofollowed by default, and blog comments are about the worst links you can get.

          I never ran into an SEO plugin that by default turned my contextual links into nofollow; do you have some examples of that, perhaps?

          OK, I read that you're talking about visitor-created links, but who cares about those? They make up a tiny % of links. Unless you allow all spam comments to be posted, but then I'd like to remind you that Akismet is also installed by default.
          I completely understand where you are coming from, but...
          Sorry - I've no idea where I mentioned a qualitative measure of links.
          If you saw that somewhere (I still can't), then fine. I was talking quantity, not quality.

          I have to pick up one point. In isolation, no link is any better or worse than any other. That's pretty basic SEO 101.
          Scrapebox and Xrumer spoiled the patch for blog and forum links respectively, with people making tens or hundreds of thousands of them inside a few hours.

          What was wrong then was their overall link profile.

          It's one of those commonly held misconceptions that "forum and blog links are bad". No, they aren't. Having zillions of them made in "blasts" is bad (which is precisely what many people did).
          Having an appropriate and proportional number, made over time on contextual websites, is about as good as any other link type.
          Taken individually, there was nothing really wrong with each link.
          As a collective, having 25,000 blog links and little or nothing else was a sure sign of self-promotion. It presented Google with a situation that would never occur naturally.

          In the same way, very soon an overlarge number of dofollow links will provide them with a very similar self-promotional signal, which they may well respond to in a similar fashion (it is all conjecture, remember).

          As for a plugin that turns links nofollow, try Yoast (the most widely used SEO plugin) for a start.

          While we may quibble about whether the percentage is in the high 60s or the high 70s, that's really not the point.

          The "rel=nofollow" tag is NOT there be default. By default all links are Dofollow. The fact that so many have CHOSEN to turn on tag and (and lets take your figure - the lower figure why not? The argument changes not one jot if I decide to concede that just to move the discussion along)

          Over time, nofollow is gaining traction. Google have a record of understanding the factors used by those who look to "game" the SERPs, and the accumulation of unnaturally large numbers of dofollow links is one such method.
          There is every chance that Google will want to level this.

          I'd rather know why you think Google will let this go on than argue over 10% that means very little (and while conceding it for the sake of this argument, I still contest that it is closer to 80%, at least according to MC).

          So why would Google be happy to let this easily visible method of self-promotion continue to be rewarded, when levelling it with their own metric (the nofollow flag is their "toy") would be push-button simple?

          If you have an answer to that one, I'd love to hear it.
          • nik0
            Originally Posted by Scritty

            I completely understand where you are coming from, but...
            Sorry - I've no idea where I mentioned a qualitative measure of links.
            If you saw that somewhere (I still can't), then fine. I was talking quantity, not quality.
            Where do I talk about quality?

            In your original post you talked about private blog networks and later on you come up with nofollow blog comments.

            Choose a path and stick with it, dude.

            And no, Yoast does not turn links in contextual posts (I refer to the private blog networks again that you mentioned) into nofollow, so again you are looking for ways out. It makes no sense to discuss this with you, so I'll leave it here.

            80% or 90% dofollow is not unnaturally large when large authority sites like Twitter and Facebook show these exact same numbers in their backlink profiles.

            Nofollow only gained traction at the sites that got massively spammed by software like SenukeX, Scrapebox, Ultimate Demon and so on. You only look at your small, limited IM world of WordPress, Ezine and a couple of others.

            If any of what you said was true then the numbers would look completely different, and clearly that's not the case.

            Anyway good luck promoting some spam software Mr. Innovator.
      • MikeFriedman
        Originally Posted by Scritty

        The 80% is Google's own figure from their latest investor call (actually they said "close to 80%").
        Do you have some further proof of this? I was on their investor call and never remember such a thing being said. I have gone through the transcript and can't find a thing about it. In fact, I don't remember them ever really talking about links on any investor call.

        Matt Cutts has in the past stated that only about 1-2%, maybe 3%, of all the links they crawl carry the 'nofollow' attribute.

        Here is a video from just last year, where he again states that nofollow links are a single digit percentage of all the links on the web.


        So I would love to know where you pulled this 80% stuff from?

        Also your prediction that Google is going to use PA/DA over PR is insane. PA and DA are junk. Highly inaccurate garbage.

        Google has no reason to change from PR. It does not matter one bit if their patent expires in 2 years, because they have not been using that same formula to calculate PR for quite a while. They don't have to divulge what they are using, either.

        It's no different from McDonald's not having to divulge the "secret sauce" recipe in Big Macs, or Coke keeping its recipe under tight wraps.

        I would just love to know where you came up with this 80% nonsense.
  • nik0
    Nice fact about the 1-3% last year.

    For sure that I ain't gonna ever read a post from him again.
  • jinx1221
    Originally Posted by Scritty

    People use the search engine out of habit.

    I can't pretend to speak for everybody, but if Google disappeared tomorrow, it wouldn't affect me on a personal level one iota. I don't have any emotional involvement with them and I don't know anybody who does. Why would anybody care? Why would you continue to use a substandard product when a better one is available for free, one you can switch to in ten seconds?

    Because just relying on habit, while your competitors have better products and engage in a more emotional way (and swapping costs nothing), is a very dangerous place for them to be.
    It wouldn't affect me on a personal level, or anyone else either. But neither would it if McDonald's disappeared. Or Marlboro, or Folgers. Consumers use these out of habit too, and for convenience, or just for the name alone. They would simply switch to something else if they had to. Why do people use Photoshop over GIMP? GIMP's free and is just as good.

    Why use Google? For me, because it's quick, it's my homepage, I've been using it for years, and 9 times out of 10 it finds what I want it to. Switch to a different search engine? Ain't nobody got time fo' dat.

    For Google, it's this habit that is the safest place for them to be, because the average consumer operates out of habit, and it is very difficult to break a habit. That's why we're focusing on Google rather than Bing or Yahoo, because that is where the consumer is. I might dare say that marketing is all about targeting the habitual nature of consumers.

    Btw, all statistics aside, I liked the article, thanks... off to grab a Big Mac, cup o' joe and a smoke.

    *edit- however, if Coca Cola disappeared, I would be pissed.. but that's just me
