Google Penguin Update - 1-2-3-4-5-6-7-8-9 and 10!

7 replies
  • SEO
I expect there to be numerous refreshes of Penguin, much like Google's Panda updates. Based on what we know about Panda, this update behaves similarly and should recur on a regular schedule.

Here are my great tips on how to beat the "Penguin":
  1. Remove Paid Text Links Using Exact Match Anchor Text: Avoid sponsored or paid links that use exact-match anchor text for your target keywords.
  2. Remove Comment Spam: Another easy footprint for Google to spot! Avoid using automated link building tools.
  3. Remove Guest Posts on Questionable Sites: Most "private" blog networks that let you upload a guest post or article have already been identified by Google.
  4. Remove Article Marketing Sites: Links from article marketing sites have been classified as unnatural links by Google.
  5. Avoid Links from Dangerous Sites: Sites flagged for malware, excessive pop-ups, link farms, or other spammy behavior are another signal that can cause a site to lose rankings in the Google SERPs.
  6. Focus on Social Media Links: Social media links have increased over 25% in estimated value to search engines this year alone. This is because search engines know they come from real people - for the most part.
  7. Focus on Relevant Directory Links: Directories are a great way to get links, especially if they are relevant to your website. How do you know which directories to target? Simple: just type "[your field] directories" into Google and you will get a long list of good directories to get listed in.
  8. Focus on Press Releases: Adding a photo or an image increases views of the release by 14%.
  9. Focus on Shareable Content in Your Press Release: Adding a video gives you a 28% increase in views.
  10. Make Press Releases Shareable: Add a video, an info-graphic, an image, and a download to each. Go all the way and you'll see a 78% jump in the number of views!
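For tip #1, a backlink export can be scanned for exact-match anchors before you start removals. Here is a minimal Python sketch; the two-column url,anchor CSV layout and the example domains are assumptions for illustration, not any particular tool's format:

```python
import csv
import io

def flag_exact_match_anchors(rows, money_keywords):
    """Return the (source_url, anchor_text) pairs whose anchor text
    exactly matches one of the target keywords, ignoring case."""
    keywords = {k.strip().lower() for k in money_keywords}
    return [(url, anchor) for url, anchor in rows
            if anchor.strip().lower() in keywords]

# Hypothetical two-column export: source URL, anchor text.
export = io.StringIO(
    "http://blog-a.example/post1,buy cheap widgets\n"
    "http://blog-b.example/post2,Acme Widgets homepage\n"
    "http://forum-c.example/t/9,buy cheap widgets\n"
)
rows = [tuple(row) for row in csv.reader(export)]
flagged = flag_exact_match_anchors(rows, ["buy cheap widgets"])
print(flagged)  # the two exact-match links, from blog-a and forum-c
```

The branded anchor ("Acme Widgets homepage") passes; only the money-keyword anchors get flagged for review.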

source : http://worldclassmedia.com/marketing...-info-graphic/
#google #penguin #update
  • bamstk090
    Readable and Clear Navigation Hierarchy and Links:
    Make a site with a clear hierarchy and text links. Every page should be reachable from at least one static text link.

    Mobile Friendly:
    Can your site be viewed easily on a mobile phone? Is your site scalable? There are ways to strip down pages for mobile devices. I believe mobile-friendly sites are the wave of the future, so start working on yours now. But make sure you do not duplicate your content by creating a duplicate mobile site: the best approach is to make your existing site mobile friendly or replace it with one that is.

    Site Speed:
    If you think about it, the faster your site is, the more users can accomplish on it, the faster they can navigate through it, and the more productive it will be. Slow sites can drive customers crazy (especially customers or clients with slow connections). In any way possible, speed up your site so users can move around it easily.

    Short URLs:
    Search engines are moving more and more toward shorter, cleaner URLs (no appended parameters or special characters in them).

    Uniformity of URLs:
    Web servers can treat uppercase and lowercase URLs as separate pages, and there is no reliable way to merge /Page and /page after the fact. Pick one convention (lowercase is typical) and use it consistently.
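One way to enforce a single URL case is to normalize every internal link before publishing. Here is a small Python sketch using only the standard library; the lowercase-everything policy is an assumption, and query strings are left untouched because their values may be case-sensitive:

```python
from urllib.parse import urlsplit, urlunsplit

def normalize_url(url):
    """Lowercase the scheme, host, and path so /About and /about
    cannot end up indexed as two separate pages."""
    parts = urlsplit(url)
    return urlunsplit((parts.scheme.lower(), parts.netloc.lower(),
                       parts.path.lower(), parts.query, parts.fragment))

print(normalize_url("HTTP://Example.com/About-Us"))  # http://example.com/about-us
```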

    Sitemap to Users:
    Offer a site map to your users with links that point to the important parts of your site. If the site map has an extremely large number of links, you may want to break it into multiple pages.

    Lower the Number of Links on Each Page:
    Keep the links on a given page to a reasonable number (under 200 is good; around 100 is ideal). Consider moving extra links behind JavaScript or linking to them higher up in the site hierarchy.
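You can count the links on a page before deciding whether to trim. Here is a rough sketch with Python's built-in html.parser; it counts every anchor tag that carries an href, and the sample markup is purely illustrative:

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Count <a> tags that actually link somewhere (have an href)."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1

html = '<p><a href="/a">one</a> <a href="/b">two</a> <a name="x">no href</a></p>'
counter = LinkCounter()
counter.feed(html)
print(counter.count)  # 2 (the named anchor without an href is skipped)
```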

    Relevancy – Don’t talk about irrelevant topics on your site:
    More information is better, but irrelevant information is worse. Create a useful, information-rich site, and write pages that clearly and accurately describe your content. Separate different subjects into different folders: if you are a rehab center and you are talking about celebrities, keep that in a "Celebrity-Rehab" category folder off the main content of your site. Don't mix "Cats" and "Dogs" links on the same page. Separate your site by subject.

    Keywords and Top Searches in the Title, H1, and Domain Name if Possible:
    Think about the words users would type to find your pages, and make sure that your site actually includes those words within it.

    Try To Use Text Links & Always Include Info On Each Picture:
    Try to use text instead of images to display important names, content, or links. The Google crawler doesn't recognize text contained in images. If you must use images for textual content, consider using the "ALT" attribute to include a few words of descriptive text.
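Here is a quick way to audit a page for images missing ALT text, again with Python's standard html.parser; the sample markup is made up for illustration:

```python
from html.parser import HTMLParser

class MissingAltFinder(HTMLParser):
    """Collect the src of every <img> with a missing or empty alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_map = dict(attrs)
            if not (attr_map.get("alt") or "").strip():
                self.missing.append(attr_map.get("src", ""))

finder = MissingAltFinder()
finder.feed('<img src="logo.png" alt="Acme logo"><img src="banner.jpg">')
print(finder.missing)  # ['banner.jpg']
```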

    Having Accurate Notations:
    Make sure that your <title> elements and ALT attributes are descriptive and accurate.

    The Broken Links and Broken HTML:
    Check for broken links and correct HTML.

    Try and keep every page you want indexed as a static page:
    If you decide to use dynamic pages (i.e., the URL contains a "?" character), be aware that not every search engine spider crawls dynamic pages as well as static pages. It helps to keep the parameters short and the number of them few.
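Counting a URL's query parameters is easy with the standard library, if you want to flag pages whose parameter count is high. The threshold you would act on is up to you; this sketch just counts:

```python
from urllib.parse import urlsplit, parse_qsl

def parameter_count(url):
    """Return how many query parameters a URL carries."""
    return len(parse_qsl(urlsplit(url).query))

print(parameter_count("http://example.com/products?id=42&sort=asc&sess=abc123"))  # 3
print(parameter_count("http://example.com/products/42/"))  # 0
```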

    The Domain Age:
    A really old domain is one that existed before anyone cared about SEO. If it existed before anyone thought about gaming the search engines then it’s less likely that it is currently trying to game them. Note that domain age is said to be reset whenever there is a change of ownership so a 10 year old domain that just changed hands last month isn’t going to provide as strong a signal as it did before it changed owners.

    The Shared IP Addresses:
    If an IP has multiple web sites associated with it, then it can be inferred that the web site owner isn't paying much for the hosting service. Spammers often choose this route to keep their costs low, and hence a dedicated IP signals that the owner is truly interested in a long-term, successful web presence.

    The Code to Text Ratio:
    Some sites that contain 100 KB of HTML code with only 2 KB of content are possibly signaling a lack of sophistication, and perhaps a lack of interest in doing what's right for the user (i.e., creating pages that load quickly and feel responsive). Since search engines want to keep their users coming back, they want to send them to sites that will be well-received and therefore make for a good search experience.
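The ratio itself is straightforward to estimate: strip the markup, then compare the visible text length to the total HTML length. Here is a rough Python sketch; a real tool would also drop the contents of script and style tags, which this minimal version does not:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Accumulate the text content of a page, ignoring markup."""
    def __init__(self):
        super().__init__()
        self.parts = []

    def handle_data(self, data):
        self.parts.append(data)

def code_to_text_ratio(html):
    extractor = TextExtractor()
    extractor.feed(html)
    text = "".join(extractor.parts).strip()
    return len(text) / len(html)

page = "<html><body><div><p>Hello, world</p></div></body></html>"
print(round(code_to_text_ratio(page), 2))  # 0.21 (12 text chars / 56 total chars)
```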

    Note that Rand Fishkin of SEOmoz quotes Vanessa Fox of Google and suggests that code is ignored by Google, implying that this ratio doesn't play any role at all.

    All CSS vs. Tables:
    There is a lot of debate about the advantages of CSS when it comes to SEO. For me, there are two signals here. The first is that a redesign from tables to CSS is picked up as a site-wide investment in the site. A site that is maintained and updated sends a signal that someone cares about it and is therefore worth a look by the search engines. The second signal is that CSS can improve the code-to-text ratio (see the previous item).

    The Valid HTML / XHTML:
    The W3C makes it easy to validate a web page and ensure that it conforms to standards. Since valid web pages almost never occur without a conscious effort to make them error-free, having such pages is a signal that someone behind the site is being careful with their efforts.

    The Existence of a Robots.txt File:

    This file, which sits in the root folder of a web site, provides instructions to search engine crawlers about what they should and shouldn't crawl. Without it, search engines assume that all content is fair game. Thus, one could argue that if the file exists and explicitly permits search engines to crawl the site, then, all other things being equal, the site that gave permission should beat out a site that didn't.
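Python's standard library can evaluate robots.txt rules directly, which is handy for checking what your own file actually permits. A minimal sketch with urllib.robotparser and a two-line example file:

```python
from urllib.robotparser import RobotFileParser

# A minimal robots.txt: allow everything except one folder.
rules = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("*", "http://example.com/public/page.html"))   # True
print(rp.can_fetch("*", "http://example.com/private/page.html"))  # False
```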

    The Bounce Rate:

    It is well known that Google Webmaster Tools shows how long people stay on your site and how many people bounce. Google has not officially stated that this is a ranking factor at all (imagine what would happen if they had: people would game it, with machines parked on pages for hundreds of hours just to push the number up). Still, I believe it plays a large part in your site's SEO performance and paints a clear picture of how long relevant visitors stay on it. Anything you can do to improve this and increase time on your site is a good idea.
    • Cosmit
      Originally Posted by bamstk090

      [quoted post trimmed; see the full post above]

      I don't know where you got this information from, but it sounds to me like a great way to confuse people and steer them away from SEO.

      Mobile doesn't matter for general search because my computer isn't using Google Mobile. If I were optimizing for phones, then it would matter.

      Google doesn't favor short URLs or uniform URLs. Matt Cutts confirmed it.

      Lowering the amount of links on a page only means the pages that you do link to get more juice while the other ones do not. There is no magic number of how many links you need to have - every website is different.

      Relevancy - Don't mix cats and dogs links on the same page? Really? You're going to go THAT FAR?

      Shared IP doesn't indicate spammers. 99% of sites are on shared IPs. Because people don't have $200/mo to fork out for dedicated servers. Those are for businesses.

      Bounce rate isn't a problem if you don't install Google Analytics or AdSense. Google doesn't even use that information in its ranking algorithm. What if my site provides definitions for words, and someone visited, got the definition in 15 seconds, and left?

      Code to text ratio? CSS? JavaScript? HTML/XHTML?

      C'mon, really?
      • StoneWilson
        Originally Posted by Cosmit

        [quoted reply trimmed; see the full reply above]

        Agreed. Many of the OP's "tips" sound unreasonable, and I really doubt what we will get after removing so many "questionable" links. Deindexed, I guess?
    • webmash
      Hi

      I agree with your points, but I'm a little confused about a couple of them:

      1. Remove article marketing sites: aren't the top article sites valuable to Google?
      2. Guest posting on questionable sites

      Can you give us a little clarification on these points?

      Still, thank you for your help to everyone.

      Natasha
  • aaron86
    I have nothing to add in this forum; you have shared information above that is very important for beating Penguin. All I can say is thanks for this very informative post.

    Aaron
  • bamstk090
    I just found that in Google Penguin Update – 10 Website Optimization Technical Tips [+ Info-Graphic].

    It contains some theories I had never seen before, so I shared it at WarriorForum so we can discuss them.

    thanks
    • beauxesprits
      I will just say.... Share your own thoughts..... DO NOT COPY PASTE.... and yeah, DUMBA$$, it's a rotten tip that is shared....
