I was just reading an article on SEOmoz about Google's recent algorithm change, which has affected a lot of marketers around the world. For the last month I have been trying to figure out what actually happened and how it affected such a large number of websites. I am posting the article here for your information; I am sure you'll find it useful. Here you go...
By now, everyone in the SEO world is aware of the algorithmic update Google launched last Wednesday, February 23rd. Several posts on the topic are worth reading, including Danny Sullivan's take, Aaron Wall's assessment, SearchMetrics' analysis and Sistrix's data-driven post.
Here at SEOmoz, we've been analyzing the shift with help from our friends at Distilled, staff research scientist Dr. Matt Peters (whom you may remember from our Google Places analysis and who's now joined our staff full time - welcome!), and several other contributors. While there's no way to be precisely sure what Google changed to impact "11.8%" of queries, we've got some ideas that fit a number of the data points and we hope to contribute to the discussion on the topic and help search marketers gauge the update's impact on their own sites.
What Factors Could Have Caused Lost Rankings?
In reviewing the sites that got hit, we were struck by a few interesting potential culprits.
[Screenshot: an eHow page on the left-hand side and an EzineArticles page on the right]
- It seemed that sites whose pages carried fewer and/or less intrusive blocks of advertisements tended to be in the winner bucket, while those with more numerous and more intrusive ads tended to be in the loser group.
- Likewise, sites whose UI/design would likely be described as more modern, high quality, thoughtful and "attractive" were winners vs. the "ugly" sites that tended to be in the loser bucket.
- When it came to user-generated-content (UGC) sites, those that tended to attract "thin" contributions (think EzineArticles, HubPages or Buzzle) lost, while those with richer, often more authentic contributions, not paid for and not intended to build SEO value or links (think Etsy, DailyMotion, LinkedIn, Facebook), won.
- In the "rich content" sector, pages with less usable/readable/easily-consumable content (think AllBusiness, FindArticles) tended to lose out to similarly content-rich sites that had made their work more usable (think LOC.gov, HuffingtonPost).
What Signals Might Google Be Using to Measure This?
- User/usage data - signals like click-through rate, time on site, and the "success" of the search visit (inferred from other usage data)
- Quality raters - a machine-learning algorithm could be trained on the sites quality raters liked vs. disliked, deriving features/factors that boost the "liked" sites and demote the "disliked" ones. This can be a dangerous way to build algorithms, though, because no human can really say why a given site ranks higher or lower or what the factors are - they may be derivatives of very odd data points rather than explainable mechanisms.
- Content analysis - topic modeling algorithms, algorithms that calculate/score readability, uniqueness/robustness analysis, and perhaps even the visual "attractiveness" of content presentation could be used (or other signals that correlate well with these).
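To make the quality-rater bullet above concrete, here is a minimal sketch of how rater verdicts could be turned into an automatic quality score. Everything here is invented for illustration: the feature names (ad density, readability, uniqueness), the toy data, and the model (a tiny logistic regression trained from scratch) are assumptions, not Google's actual signals or method.

```python
import math

# Hypothetical per-site features: [ad_density, readability, uniqueness],
# paired with a binary quality-rater verdict (1 = liked, 0 = disliked).
sites = [
    ([0.8, 0.3, 0.2], 0),
    ([0.7, 0.4, 0.3], 0),
    ([0.2, 0.8, 0.9], 1),
    ([0.1, 0.9, 0.8], 1),
    ([0.6, 0.5, 0.4], 0),
    ([0.3, 0.7, 0.7], 1),
]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Train a tiny logistic-regression model with plain per-sample gradient descent.
weights = [0.0, 0.0, 0.0]
bias = 0.0
lr = 0.5
for _ in range(2000):
    for features, label in sites:
        pred = sigmoid(sum(w * x for w, x in zip(weights, features)) + bias)
        err = pred - label
        weights = [w - lr * err * x for w, x in zip(weights, features)]
        bias -= lr * err

def quality_score(features):
    """Score an unseen site; higher means closer to the 'liked' cluster."""
    return sigmoid(sum(w * x for w, x in zip(weights, features)) + bias)

print(quality_score([0.9, 0.2, 0.1]))  # ad-heavy, thin site
print(quality_score([0.1, 0.9, 0.9]))  # clean, content-rich site
```

Note how this illustrates the "dangerous" part of the bullet: the learned weights encode correlations in the rater data, but nothing in the model explains *why* a site scores low, which is exactly the opacity the article warns about.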
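For the content-analysis bullet, one example of an automatable readability signal is the Flesch reading-ease score. This is only an illustration of the kind of metric such an algorithm could compute, not a claim about what Google uses, and the syllable counter below is a crude vowel-group heuristic.

```python
import re

def count_syllables(word):
    """Rough English syllable estimate: count vowel groups, drop a silent 'e'."""
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and count > 1:
        count -= 1
    return max(count, 1)

def flesch_reading_ease(text):
    """Flesch reading ease: higher scores mean easier-to-read text."""
    sentences = max(len(re.findall(r"[.!?]+", text)), 1)
    words = re.findall(r"[A-Za-z']+", text)
    n = max(len(words), 1)
    syllables = sum(count_syllables(w) for w in words)
    return 206.835 - 1.015 * (n / sentences) - 84.6 * (syllables / n)

simple = "The cat sat on the mat. The dog ran fast."
dense = ("Comprehensive monetization considerations necessitate "
         "sophisticated organizational infrastructure evaluation.")
print(flesch_reading_ease(simple))
print(flesch_reading_ease(dense))
```

Short sentences of short words score high, while long jargon-heavy sentences score low, so a metric like this could plausibly separate "easily consumable" pages from dense, hard-to-read ones.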
Here is the source of the article: Google's Farmer/Panda Update: Analysis of Winners vs. Losers | SEOmoz