What Might "Google Pelican" Bring To The Table?
- SEO
Sorry for the cheeky title, "Google Pelican"? What's that?
Could it be a new named update from Google? Nope, sorry (or phew, what a relief, more like): no new algorithm has been released, at least not at the time of writing.
This is pure supposition: "Pelican" seems to fit the theme of animals beginning with the letter "P", so that's what I've gone with.
I'm a cynic at heart; the only thing that really surprises me about Google's updates is the order in which they come. There are a lot of problems in the index: you can see them, they can see them. It's just a matter of which ones they want to address first. Perhaps a couple of surprises might be how late they are to react to many of them, and how ultimately ineffective the fixes tend to be in the fullness of time.
Google are planning some big changes. Are you ready? For those new to SEO, it might be understandable to think that Penguin and Panda were the be-all and end-all of algorithm updates. But that isn't the case: there have been dozens, most of them named and discussed over the years. Check out (for example) "Fritz", "Dewey" and "Caffeine".
Nobody talks about these much as we approach 2014, which is odd, as they are still an integral part of the indexing and ranking process. To ignore them would, at worst, risk jeopardising your site's position in the search engine rankings, and at best mean missing an opportunity to improve a little on your current position.
That there will be another update soon, and that it will be given a name of some sort, is a given. It will definitely happen, probably sooner rather than later.
Concentrating on Panda and Penguin for now, let's see what their goals were:
- Panda set out to analyse on-page content more closely, looking for linguistic anomalies and plagiarism, and penalising sites that had previously benefited from publishing or reusing sub-prime content.
- Penguin did a similar job with links, looking at their distribution and every other aspect of them: the anchor text (natch), the platform, the chronology of the link building, the surrounding text and the context in which they were used, even down to how they might be repeated or reposted on social media.
When I read about "Shocking updates" and "Google cleaning house at last" it's a face-palm moment for me.
Should we be shocked, or even pretend to be, when Google produce yet another update that promises to analyse content more thoroughly?
Is it news to anyone who has been in the business more than a year or two that they tweak their algorithms to check the provenance of links with a view to penalising excessive self-promotion?
It's not a shock to me. A mild surprise on the day a new update is named and released, maybe, but that's about it. They never announce the actual release date and are vague about specifics, but the overarching purpose of these updates is as old as the hills, and the fact that they are released should "shock" no one.
If either of these ideas is a surprise to anybody in SEO, they're in the wrong business. There is nothing shocking, new or even particularly innovative about either Penguin or Panda. They are part of an iterative process that has been ongoing for at least 13 years, and will continue.
But what about the next update? We can guess it will again hit spammy content hard or, alternatively, punish the most egregious of self-promotional link builders, but what exact methods might Google employ? What levers will they pull, and which specific metrics will be analysed?
Is there anything we can guess beforehand? Might it be possible to make a good guess now at what's coming and take the time to reposition our sites and online presence to avoid being penalised?
Here, I've looked into my crystal ball and come up with five specific ideas that I'm sure the engineers at Google are at least looking at, if not currently actively countering. Remember, these are just my guesses. Fingers crossed that I don't make a complete fool of myself. Hopefully I'll at least get one or two of them right.
I'm putting this one first as, to me, it's almost a given. Until very recently, dofollow links offered by far the best search engine ranking potential of anything in offsite SEO. To this day, private networks offer them by the thousand, rented by the month. Services and spammers seek them out and use them in preference to links tagged with rel="nofollow".
But there's a problem. Over the past two years the entire Internet has been moving away from dofollow. Open source platforms and closed bespoke websites are being redesigned, updated or just edited to include the nofollow tag on all outbound links at an increasingly rapid rate. Now, over 80% of all the links that Google analyse carry the nofollow tag.
So what are they to think of a site where this ratio is reversed? A site with over 80% dofollow links, and in some cases hundreds if not thousands of them?
A situation like that is exceedingly unlikely to have occurred naturally. It is a huge red flag of self-promotion. By putting the hammer down on commercial sites with a high percentage of dofollow links, Google would go a long way towards destroying the paid link networks they so vigorously target, as well as wiping out the most excessive of the solo link spammers into the bargain. A real win-win situation for them, and one that would be simple to implement. Expect it!
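To make the heuristic concrete, here is a minimal sketch of the kind of check being described: measure what share of a backlink profile lacks rel="nofollow" and flag profiles that invert the web-wide 80/20 split. The 80% threshold, the regex and the sample links are my own illustrative assumptions; nothing here is a published Google method.

```python
import re

# Matches a rel attribute that contains the "nofollow" token.
# Illustrative only - a real crawler would use a proper HTML parser.
NOFOLLOW_RE = re.compile(r'rel\s*=\s*["\'][^"\']*\bnofollow\b', re.IGNORECASE)

def dofollow_ratio(anchor_tags):
    """Return the fraction of <a> tags that do NOT carry rel="nofollow"."""
    if not anchor_tags:
        return 0.0
    followed = sum(1 for tag in anchor_tags if not NOFOLLOW_RE.search(tag))
    return followed / len(anchor_tags)

def looks_unnatural(anchor_tags, threshold=0.8):
    """Red flag: the dofollow share inverts the web-wide ~80% nofollow split."""
    return dofollow_ratio(anchor_tags) > threshold

# A hypothetical backlink profile: nine dofollow links, one nofollow ad link.
backlinks = ['<a href="http://example.com/p%d">anchor</a>' % i for i in range(9)]
backlinks.append('<a href="http://example.com/" rel="nofollow">ad</a>')

print(dofollow_ratio(backlinks))   # 0.9
print(looks_unnatural(backlinks))  # True - well above the 80% threshold
```

A natural profile, where most links come from nofollowed platforms, would score well under the threshold and pass untouched.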
The last 18 months have seen an increase in the value of social signals, in particular those from Facebook. While the social network does offer a decent indication that a website's community is active and contributing, it also raises a major commercial problem for Google themselves.
Facebook is Google's main CPC ad competitor
Google survive primarily on advertising revenue via pay-per-click, and Facebook are now their major rival. To value social signals from Facebook so highly is, via the back door, to actively promote that rival.
In other words it's costing them dollars, probably, over time, tens of millions of them.
This likely won't be a major slap, but I imagine that overlarge benefits from Facebook social signals will be levelled in a similar way to how EMDs were last year.
The last six months have seen brand-new sites, even ones with poor content and overtly self-promotional linking backgrounds, sneaking their way onto page one for some very commercial terms, often knocking valuable resource websites down in the process. It's long been thought that domain age and activity play an important part in establishing trust, but over the last six months domain age and established good practice seem to have actually counted against many sites. This has likely been an error, or an overcompensation for other factors, on Google's part, and one they are certainly aware of.
Age itself is really a composite of three other elements:
- The amount of content the site has (the more the better).
- The activity of the site over time (updates are good, establishing a pattern of consistent updates takes time).
- Finally, the trust and authority the site has, both of which take time to accrue.
None of these three elements seems to matter at all for the new sites that have dominated mid to long term commercial terms since the spring of 2013.
Matt Cutts has actually denied that age plays a major role, then given a list of factors that are important, none of which could be achieved without the passage of considerable time. Time which, he then says, is measured. Talk about answering a riddle with a question.
It's not difficult to see how a situation like this would occur. Going back to the updates I mentioned at the top of the post, one in particular, "Caffeine", was designed to scour the web more quickly, find new content and index it much faster than had previously been the case: getting it indexed, and sometimes ranked well, in days rather than weeks. The idea was to value activity over stagnation.
Balanced against this is the important metric of age and trust. The tipping point has wavered first one way and then the other over time. Currently it overbalances towards new sites, giving them high prominence regardless of their on-site and off-site SEO.
In the past, new URLs, pages or posts on established sites have been the way forward. Now brand new sites are winning, often offering little or no value in the process.
The old method of "new URLs on trusted sites" is the one Matt Cutts has previously commented on as the preferred approach:
Establish your site's credentials and then remain active.
Remaining active also meant the transfer of PageRank through the newer URLs of a well-linked site would be a natural process. Everyone was a winner (if they played the game right).
However, I suspect that it is PageRank, and the repositioning of it as it reaches the end of its patented life, that may well be the cause of this current shift. Expect it to be corrected.
While we're on the subject...
The patent on PageRank has less than two years to run. In June 2015 it expires and enters the public domain. Stanford University own the patent, and Google use it under licence (contrary to what many believe, Google do not own and did not invent PageRank).
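For reference, the core idea the patent covers boils down to a short power iteration over the link graph. This toy sketch uses the standard textbook formulation with the usual 0.85 damping factor; the three-page graph is my own illustration, and Google's production system is of course far more elaborate.

```python
# Minimal PageRank power-iteration sketch (textbook version, not Google's
# production system). 'links' maps each page to the pages it links out to.

def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with uniform rank
    for _ in range(iterations):
        new = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:                     # dangling page: share evenly
                for p in pages:
                    new[p] += damping * rank[page] / n
            else:                                # split rank across outlinks
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new[target] += share
        rank = new
    return rank

# Page C is linked to by both A and B, so it accumulates the most rank.
graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
ranks = pagerank(graph)
print(max(ranks, key=ranks.get))  # C
```

The point for this discussion is how simple the published method is: the value comes from the exclusivity and the data, which is exactly what expiry threatens.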
It's unlikely that the search engine giant will want to rely on a public-domain method for authority and trust distribution. This puts them in a tricky situation.
Developing their own in-house system that is very similar to PageRank could lead to a slew of legal cases challenging their exclusive right to it if others adopt something similar; the "look and feel" would need to be sufficiently different and unique to stand a chance of holding up in law. They might end up with nothing like the exclusivity they are likely to want.
But on the other hand, inventing an entirely new method for the attribution of authority and trust could mean a sea change in the SERPs bigger than anything any individual algorithm update has produced before.
Whatever method is used, expect the next few updates to implement changes that will pave the way for the new system. Some may not even make sense in the short term. We may only have that "a-ha" moment once the entire system is revealed and makes sense of the situation.
Google have a long-standing problem, and they know it. They do not engage their target audience in a loyalty-commanding or emotional manner. Unlike on Twitter or Facebook, very few friendships are made or maintained on Google's social platform, "+1".
People use the search engine out of habit, probably clicking into their browser's tools menu, selecting it as the default opening page, then promptly forgetting about it. It's a service that finds sites, no more, no less. Or at least it was...
These days a user is likely to type in a search query and be presented with results surrounded with an increasing number of advertisements along the top and right hand side. Google's footprint for self promotion is increasing all the time. For many terms there is not one natural search result on Page one.
Ask yourself this:
"Do I want to use a service that just presents me with its own advertisements? At what point does it stop being a search engine and become "junk mail".. Not only junk mail, but junk mail that I'm (for some bizarre reason) actively opting to have shoved in my face when there are better, free alternatives available? |
It costs nothing to switch, and it takes seconds to do. If you have no emotional involvement in a product, why would you, or anybody else, care? Why continue to use a substandard product when a better one is available for free and the swap takes ten seconds?
The answer they are likely to come up with is to promote their own "+1", to give it more relevance: an attempt to get people to really engage with them on a social level and want to stay with Google for some reason other than habit. Relying on habit alone, while your competitors have better products, engage in a more emotional way, and cost nothing to swap to, is a very dangerous place to be.
OK, that's my five predictions. And the name "Google Pelican" (hxxp://www.demondemon.com/2013/09/09/introducing-the-google-pelican-algorithm/)? Wild guess. I might pop down to the local betting shop to see if I can get decent odds on it though.
Scritty