by seoquicktop Banned
8 replies
  • SEO
Dangerous Links



In this article I want to discuss a particular metric that helps determine when the search engines penalize websites in their search results. The metric is the ratio of bad links to good ones. Whether a link is bad is determined by how visible the linking site is in the search results and how much spam flows through it. For very reputable sites that don't work on their link-building profile, either directly or indirectly, the metric's value does not exceed 2. If they do conduct public relations on the Internet in one way or another, this value rises to about 6. Values between 10 and 40 are the risky zone, and when the value exceeds 40, a website is definitely penalized by the search engines. Although this metric is not well researched, it still makes sense, and it seems quite promising. Note that it shows the volume of bad links but says nothing directly about the usefulness of the existing links.

When you finish scrubbing your site of bad links, you need to keep working to increase the utility of your backlinks: build the brand, increase the site's credibility, and improve your guest blogging.

Without further ado, let's dive into the details.

First, I want to discuss several reputable sites in general, then talk about sites that are not as great but still quite respectable, and which I know are not under the filter for the use of low-quality links. Finally come the sites that are under the filter. The difference in the metric's value is striking. The method for calculating the danger zone is based on the relationship between visibility and spam. What does that mean? To assess the risk posed by a donor domain, it makes sense to consider indicators that reflect, first, the attitude of "the world to the domain" and, second, of "the domain to the world." The former is characterized to some degree by the donor domain's authority and importance in the eyes of the world and the search engines. The latter describes how spammed the domain is with external links; it can be evaluated through services available online.

These characteristics are traditionally taken into account through various kinds of integrated indicators of the "weight" of incoming links (CY, PR, Trust, etc.). I propose to use new parameters instead: the number of keywords for which a website ranks near the top, and/or the visibility of the website in the search results. These two parameters have been shown to correlate with each other, and in some cases they are preferable to the Trust-style indicators above. In my experiments I use the visibility of a website rather than the volume of keywords; I'll look into the latter some other time. To assess a donor domain, I use two values: its visibility in the search results and the spam level of its outgoing links. I consider visibility low if it falls below a certain threshold, which I set at 0.08; I arrived at that value by trial and error. I consider the spam level low if it is under 10, and high otherwise.

If the visibility is at an acceptable level and the donor domain is not spammed, the domain is safe. If visibility is low but the spam level is also low, the domain is questionable. Domains with lots of spam are considered bad, and so are domains whose spam level cannot be determined (when a spam level is present, it is easy to read off). Thus every donor domain is either good, questionable, or bad. The metric I describe in this article is the ratio of bad donor domains to good ones.
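
To make this concrete, here is a minimal sketch in Python of the classification and the ratio as described above. It assumes the visibility and spam figures come from whatever analysis service you use; every name in it is hypothetical, not the API of any particular tool.

VISIBILITY_THRESHOLD = 0.08  # below this, visibility counts as low (found by trial and error)
SPAM_THRESHOLD = 10          # at or above this, the spam level counts as high

def classify_domain(visibility, spam_level):
    # spam_level may be None when the service cannot determine it;
    # such domains are treated as bad, which slightly inflates the metric.
    if spam_level is None or spam_level >= SPAM_THRESHOLD:
        return "bad"
    if visibility >= VISIBILITY_THRESHOLD:
        return "good"
    return "questionable"

def bad_to_good_ratio(domains):
    # domains is a list of (visibility, spam_level) pairs, one per donor domain.
    labels = [classify_domain(v, s) for v, s in domains]
    good = labels.count("good")
    bad = labels.count("bad")
    return bad / good if good else float("inf")

# Example: five donor domains, two of which come out bad.
profile = [(0.25, 2), (0.01, 55), (0.40, 4), (0.03, None), (0.12, 1)]
print(bad_to_good_ratio(profile))  # 2 bad / 3 good, i.e. about 0.67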

What is the acceptable value of the metric?

To get a rough, approximate value of the metric, let's focus our analysis on the profiles of well-known sites with solid reputations - the end goal towards which we should drive the progress of the brand and the site, and which the search engines almost certainly do not penalize. If we look at a nonprofit scientific website that does not work on links or SEO of any kind, we'll notice that all its links are of natural origin. The typical metric for such a site is around 1.5. Interestingly, even for sites with a crystal-clear backlink reputation, the number of bad links still exceeds the number of good ones. A more thorough, domain-by-domain review of the data shows that this is partly a consequence of the imperfection of the metric and of the technical tools used to calculate it. If the Trust indicator cannot determine the spam level, we assume the worst - that the site is spammed and bad for reputation - so a more detailed review is required at this stage to separate bad domains from good ones. It turns out that, because of these imperfect means of calculation, the number of bad domains appears larger than it actually is. Despite this, even the "exaggerated" share allows you to draw appropriate conclusions - conclusions consistent with common sense. So what are the thresholds for sites that are not bad, taking into account the current realities of imperfect calculation? For a very reputable site the value is about one and a half, even though such sites attract a lot of links.

What will the metric be for less reputable sites?



If we turn to the website of an expert, a consultant, or another professional, they have their own audience - fairly narrow but, nevertheless, quite strongly engaged and loyal. They don't buy any links; the natural link mass is generated by their focus audience. The metric value is just a bit over 1, and even in this pure laboratory case the number of bad links exceeds the number of good ones. This is again a consequence of the imperfection of the tools and/or algorithms involved in calculating the metric. The value is low, even lower than for the more authoritative site above, which is apparently due to the small number of domains linking to experts' sites. It is safe to assume that the level of popularity correlates directly with the number of bad links (and, accordingly, with the number of inbound links), because of the large number of sites that re-publish everything, often automatically.

If we turn our attention to a business website operating mainly in the B2B sector - selling mostly through personal contacts and, to a lesser extent, via the Internet - we find that the metric's value is around 4, significantly higher than in the cases above. However, this value is still good, and it means the search engines won't penalize the site. Most of the time these sites hire SEO consultants who buy links, but not too zealously, and the link profiles are cleaned from time to time, also not too zealously. As a result, we have regular middle-aged sites with a regular history that are not under the filter and were never penalized. The majority of sites are like this; they have a relatively small number of links. As you might recall, the number of links increases the metric. The number of linking domains is usually between 10 and 100.

Now the fun stuff: let's consider the sites penalized by the search engines. Their metric is usually over 40 - different from the sites reviewed above by an order of magnitude - and the fact that the search engines penalize these sites makes good sense: a link profile of this quality certainly can't produce relevant search results for users.

Sites with a metric of 10 are touching the risky threshold; those over 10 and approaching 40 are certainly at risk. It's very tricky to determine exactly where the magic number lies, but it's crystal clear that you don't want to find yourself in that range.


Conclusions
  • A site is very respectable if the metric's value is less than 2 and the number of linking domains is measured in the hundreds.
  • A site is respectable if the value is less than 5 and the number of linking domains is measured in the hundreds.
  • If the value is less than 5 and the number of links does not exceed one hundred, the site is a typical mid-level site, and the search engines don't penalize it.
  • If the value is less than 10 and the number of links does not exceed one hundred, the site does not seem to be threatened by the filter in the near future, but scrubbing and/or growing the link profile may be required.
  • A value over 10 means the site is definitely at risk; in this case, a total scrubbing of the link profile is problem number one (see the sketch after this list).
  • It'd be nice to have a service that would calculate these values automatically.
  • The growth in the popularity of a website increases the ratio of bad to good links, which is normal and natural, as long as it doesn't get close to 10.
  • This metric has some disadvantages, but it generally gives adequate results.
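
Since no such service exists yet, here is a minimal sketch in Python that maps these conclusions onto risk bands. The thresholds are the ones proposed in this post, not any official search-engine rule, and both parameter names are hypothetical.

def risk_band(ratio, linking_domains):
    # ratio is the bad-to-good metric; linking_domains is the size of the profile.
    if ratio < 2 and linking_domains >= 100:
        return "very respectable"
    if ratio < 5 and linking_domains >= 100:
        return "respectable"
    if ratio < 5:
        return "typical mid-level, not penalized"
    if ratio < 10:
        return "not threatened yet; consider scrubbing or growing the profile"
    return "definitely at risk; scrub the link profile first"

print(risk_band(1.5, 300))   # very respectable
print(risk_band(4.0, 60))    # typical mid-level, not penalized
print(risk_band(42.0, 500))  # definitely at risk; scrub the link profile first

The edge cases the list leaves open (for example, a ratio between 5 and 10 with hundreds of linking domains) are lumped into the nearest band here.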
#dangerous #links
  • Volodymyr Ulitovskyy (Banned)
    The author did not explain how I can run this metric on my website - is there a special online tool for that purpose?
    Also, what kinds of links are considered "bad" or "spam"? It would be nice to know...
    Finally, it would be helpful to provide examples of websites in different categories - maybe the author can post them in the comments.
    • mommywriter
      Originally Posted by Volodymyr Ulitovskyy

      The author did not explain how I can run this metric on my website - is there a special online tool for that purpose?
      Also, what kinds of links are considered "bad" or "spam"? It would be nice to know...
      Finally, it would be helpful to provide examples of websites in different categories - maybe the author can post them in the comments.
      Dangerous links have been bothering me in my blogs for a long time, and I was always at a loss to make out which ones could be considered dangerous. Thanks for some great guidance.
  • I agree - the author needs to expand on his tips with more examples and good, verified sources that beginners can use to improve their rankings.
  • getreal5
    Good info but no backup. Please expand with sources and examples proving your point.
  • webby0031
    Hahah, more useless crap for newbies to digest and be led down another path to nowhere.
  • mommywriter
    Very valid and to-the-point conclusions. Smart summing up. I am really indebted to you for adding some smart solutions to my inventory.
  • mommywriter
    Is there a tool that can identify dangerous links automatically and remove them all by itself? Anything you can recommend?
  • paulgl
    The OP's wall-of-spamzola article has nothing to do with links that hurt.

    Google fully explains what links hurt.

    Everyone and their brother would be paying/getting links on crappy sites, but they don't.

    People have no clue as to how google or the internet work. End of story.

    seoquicktop is rapidly becoming seoquickspam.


    Paul
    Signature

    If you were disappointed in your results today, lower your standards tomorrow.

