How to recover your site from Penguin and Panda updates
Yes, excessive link-building practices have some effect, but only in conjunction with a poorly designed site (slow loading, too many ads relative to content, duplicate pages, an SEO-ized writing style). Fix the site and Judge Google will set it free.
And please be responsible. Many people are desperately seeking this help and can be discouraged by confusing arguments and comments. If you disagree, how about taking a problematic page from your site and applying the steps below to it? Let it run for a week or two, then come back here to post a feedback-based comment.
As of today, Penguin is still in test-and-feedback mode. This strategy is intended for the long run, no matter what the algorithm's name.
This formula holds onsite factors accountable for a long list of reasons.
I started the experiment by putting the top 7 sites in different niches under the microscope.
I was convinced offsite factors have no impact (with the exception of blatant violations, the kind you get a GWT notification for).
My first visit to the top 7 sites revealed they were all implicated in blackhat SEO, some on a large scale (hundreds of forum profile links, bookmarks, comments, etc.).
Which is not a secret.
Every site that has attempted to rank commercially on Google during the last 10 years has engaged (in some form or other) in unnatural link building and self-promotion (fake reviews, manipulated Alexa ranking, etc.).
What's more, in almost all niches the sites in 5th and 6th positions have a larger link profile than the top 4, which told me Google is simply not counting thousands of their links. Big G is not penalizing them; it just ignores the portion of their links it considers shady.
When comparing those top sites to the ones that were competing with them and dropped, I noticed the following pattern. Those that dropped have a combination of the following:
1 - A higher keyword density (7%+).
2 - A higher ratio of duplicate pages.
3 - A higher ratio of thin content above the fold.
4 - Duplicate RSS feeds.
It has to be all of the above combined, not each separately. If a site only has keyword stuffing, a penalty is applied only to that specific keyword, not even to the page itself.
Also, those top sites have:
1 - A high number of sandboxed duplicates (the former -50 penalty).
2 - A higher number of ranking pages (apparently due to their traffic ratio).
Where did these findings leave me?
They confirmed to me that:
1- Offsite factors are not as major an issue as many believe.
2- Duplicate content is punishable across the board and is a factor.
3- Disparity in keyword density (keyword stuffing) is a big ranking factor.
This finding agrees with Google's repeated assertion that they care more about the content served to their users. Duplicate and keyword-stuffed content hurt their users' experience. Links do NOT.
Working on 1 & 2 above resulted in freeing wounded sites from the sandbox, in particular the homepage. Internal pages recovered partially (they resurfaced in the SERPs but ranked poorly).
I wondered why homepages recovered successfully but internal pages did not. The first thing that jumped to my mind is that homepages are usually UNIQUE. It turned out (at least as I saw it) that the site-wide penalty still held even for cleaned internal pages as long as their sister pages were not cleaned yet. In other words, if you clean 5 out of 30 pages, those 5 will continue to perform poorly until all 30 are fixed. That's a Panda penalty, refined.
To cut it short, for those trying to recover from Panda/Penguin, try this:
1- Reduce keyword density. Using this method exclusively, I helped recover some sites from Panda whose owners had almost given up. Here's how you do it. First things first:
Create a three-column table with these headers:
Keyword | Count | Density
Now visit one of these sites (or Google for your own). I used these; their results vary, but they are still helpful:
Keyword Density Checker - Keyword Cloud
Live Keyword Analysis
Paste your URL, then copy the keyword, count, and density results into the table you created. Take notice of the density ratio only; forget the other stuff.
Now go back to the checker and paste your best competitor's URL (one that is still ranking).
Do you see the difference?
Now try to match that keyword density by reducing yours. From my observation, Google compares your ratio to your market's average. A 2% limit is recommended; 3% is the maximum. The lower you set it, the higher you will go in ranking. As you do this, you will notice your article getting "more" natural (OMG, I'm writing for readers for the first time).
Did you match it? If yes, congrats. If not, go back to work.
Don't forget to remove any keyword stuffing in your links' title attributes and img alt text (Google counts those too in the ratio calculation). No more than 1 or 2 words. While you're at it, remove above-the-fold ads (site-wide). Also check any reciprocal linking: a decent site you linked back to 2 years ago may now be a spam/malware site popping up a download box (someone's site got filtered out simply because of this). A rough sketch of this kind of density check follows below.
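If you want to sanity-check the numbers those online tools give you, here is a minimal density-check sketch in Python. It is only an illustration under my own assumptions, not any official tool: the URLs and the keyword are placeholders, it counts single words rather than phrases, and it uses nothing beyond the standard library.

```python
# Minimal keyword density sketch (assumption: the pages are publicly fetchable).
# Counts visible text plus title/alt attribute values, since those are said
# to be included in the ratio as well.
import re
import urllib.request
from collections import Counter
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible words plus title/alt attribute values."""
    def __init__(self):
        super().__init__()
        self.words = []
        self.skip = False                      # True while inside <script>/<style>

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip = True
        for name, value in attrs:
            if name in ("title", "alt") and value:
                self.words += re.findall(r"[a-z0-9']+", value.lower())

    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self.skip = False

    def handle_data(self, data):
        if not self.skip:
            self.words += re.findall(r"[a-z0-9']+", data.lower())

def keyword_density(url, keyword):
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="ignore")
    parser = TextExtractor()
    parser.feed(html)
    counts = Counter(parser.words)
    total = sum(counts.values()) or 1
    count = counts[keyword.lower()]
    return count, round(100.0 * count / total, 2)

# Placeholder URLs: your wounded page and a competitor that still ranks.
for page in ("http://example.com/my-page", "http://example.com/competitor-page"):
    count, density = keyword_density(page, "widgets")
    print(page, count, f"{density}%")          # aim for roughly 2%, 3% at most
```

Run it against your page and against a still-ranking competitor, then work your percentage down toward the market's average.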
When you're done, go to your sitemap and update the date.
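If your sitemap is a standard XML one, updating the date just means bumping the <lastmod> value on the pages you cleaned. Here is a rough sketch, assuming a sitemaps.org-format sitemap.xml; the file name and URLs are placeholders, so adjust them to your own site.

```python
# Rough sketch: update <lastmod> for the cleaned pages in a standard sitemap.xml.
import datetime
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
ET.register_namespace("", NS)                  # keep the default namespace on output

cleaned_urls = {"http://example.com/", "http://example.com/fixed-page.html"}
today = datetime.date.today().isoformat()

tree = ET.parse("sitemap.xml")
for entry in tree.getroot().findall(f"{{{NS}}}url"):
    loc = entry.find(f"{{{NS}}}loc")
    if loc is None or loc.text not in cleaned_urls:
        continue
    lastmod = entry.find(f"{{{NS}}}lastmod")
    if lastmod is None:                        # add the element if the entry lacks one
        lastmod = ET.SubElement(entry, f"{{{NS}}}lastmod")
    lastmod.text = today

tree.write("sitemap.xml", xml_declaration=True, encoding="UTF-8")
```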
Now log in to your GWT, select the wounded site, look to your left and click Diagnostics, then click Fetch as Googlebot. Enter the URL (the root should already be there; just add the rest). Then click Submit (make sure you submit only that URL, not the "URL and its linked URLs" option).
Submit a reconsideration request:
docs.google.com/spreadsheet/viewform?formkey=dEVxdmdRWFJRTjRoLWZVTHZkaTBQbkE6M Q
Now wait 3-5 days and the magic will happen (either automatically or manually, assuming that was your only issue).
If no joy yet, you need to work on duplicate pages. Google for "duplicate page checker" or use this one below:
Similar Page Checker - Duplicate content checker
If you find any, insert the rel=canonical tag in the duplicate page to point to the one you prefer ranked. Some dynamic, database-driven pages will simply not recover under these algorithms. Most of those will be considered similar no matter how much you tweak them toward uniqueness. Googlebot goes beyond unique meta data and a 10%-different body content, and also analyses those pages based on their template structure. A change in the city or product name does not make them unique. What you can do is focus on ranking those that are accepted and indexed. Hopefully you get some Yahoo referrals from the others, or try to recuperate their lost traffic through DailyMotion videos.
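If you prefer to script the similarity check, here is a minimal sketch along the lines of the online similar-page checkers above. The URLs are placeholders, and a text-only comparison is cruder than what Googlebot does (it ignores template structure), so treat the percentage as a rough signal only.

```python
# Minimal similar-page sketch: strip tags and compare the remaining text.
import re
import urllib.request
from difflib import SequenceMatcher

def page_text(url):
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="ignore")
    html = re.sub(r"(?s)<(script|style).*?</\1>", " ", html)    # drop scripts/styles
    text = re.sub(r"<[^>]+>", " ", html)                        # drop remaining tags
    return re.sub(r"\s+", " ", text).strip().lower()

def similarity(url_a, url_b):
    return round(100 * SequenceMatcher(None, page_text(url_a), page_text(url_b)).ratio(), 1)

# Placeholder pages; a very high score makes the pair a candidate for
# rel="canonical", e.g. <link rel="canonical" href="http://example.com/preferred.html" />
print(similarity("http://example.com/page-a.html", "http://example.com/page-b.html"), "% similar")
```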
By this stage, your waiting time will be around a week or two. Past this, you definitely have an offsite SEO penalty. You need to do some whitehat link mixology to offset the negative ratio. Even if you don't have a penalty, these steps will boost your rankings. Proceed as follows:
1 - Do some LOCAL directory linking (one PR5 link from these can drag your site back up 20 pages). By local, I mean country-specific. Don't waste your time on any other.
2 - Do MANUAL social bookmarking (SU and Digg are the best). Write real reviews; Google ignores one-line comments like "Great read, thank you for the share".
3 - Do some article marketing. I recommend HubPages, but they should be real articles that are highly DIFFERENT from those already indexed. Penguin can detect rehashed information and will drop the article even if it got indexed the first time.
4 - Do not hurt your baby article with backlinking during its probationary period (the first 2 weeks).
Only tweet and Facebook it on your own accounts. Then make sure someone else social bookmarks it on SU and Digg. A video would be a great extra. That's it. No forum profiles, no Angela & Paul, and no Fiverr.
5 - Tweet your links more, with variations of the same anchor text. Do the same thing with FB.
6 - Do blog commenting on AUTHORITY blogs in the same niche. Blog commenting does boost your site's ranking when credited, but only comments on authority blogs in the same niche will count. 50,000 automated comments will NOT.
Why is commenting important in recovery and ranking?
It builds AUTHOR CREDIBILITY with Google, which is more interested in you as a blogger than in your post or link. It is also a social signal of your interactivity. Engaging in commenting validates you as an interesting source of information. This formula would be lame if I kept any of the how-to to myself, so I am going to share the exact commenting technique I deployed. And, of course, I tested it before implementing it.
How did I test it?
I set up a free blog and decided to rank it with Twitter, Facebook, and commenting only. I had already tried Twitter and Facebook alone with another blog, and that did not cut it. So here you go:
1 - Share your article's link and a good snippet on Twitter MANUALLY (no URL shortener).
2 - Ping your Twitter RSS. Do not log out of Twitter.
3 - Share the same article's link and a good snippet on your Facebook fan page MANUALLY (no URL shortener).
4 - Ping your FB fan page's RSS. Log out of FB.
5 - Go back to Twitter. If you are following good content providers, look for the best tweets and retweet them. If not, type your niche in the search box on top, follow some good brains, and retweet their stuff.
6 - Follow the tweeted link and post a comment, signing in with Twitter.
7 - Post a good comment with NO LINK whatsoever (your Twitter URL should be loaded along with your photo once you have signed in).
Now Google will follow your Twitter address back to your original tweeted post (and those retweets).
Google will credit your Twitter profile in its own way (pretty valuable).
See the difference?
You get credit for yourself as a twitterer (through your followed Twitter profile). In other words, you're reconciling with Google and establishing "author credibility". As a consequence, your entire blog will rank. By the way, the 2-week-old test blog is now sitting on page 1 (6th position).
Before commenting, always remember that Google assumes you're a bot until proven innocent. Therefore, to make the comment count, prove you're not a bot by referencing words, phrases, or concepts from the blog post. Basically, do what automated commenting can't. Instead of:
Great article, thanx
try something like:
Did not realize that Personal Branding can be that beneficial to one's business. Thanks for the heads up
Not only is the latter comment more likely to be approved by the blog owner, Googlebot will also realize that non-humans cannot relate intelligently to a blog post they haven't read. Also, when was the last time you heard of a blog owner disapproving a Twitter signed-in comment?
I'd like to remind you that this recovery method is a per-page process. Recovering the homepage (which is extremely important) will not recover internal pages if they, too, have issues.
Lastly, take a break to refuel, then simply rinse and repeat the above steps and you should be able to clean everything.
The process above is based on actual work done on wounded established sites that DID recover.
Hope this helps.
N.B.
Since the Penguin algo is still being tweaked, there may be some false positives as well as floating spam. Google has a form where you can report those poor pages that should have been caught but weren't:
https://accounts.google.com/ServiceL...?hl%3Den&hl=en