Over the last couple of years, the whole SEO process has undergone drastic changes. Even now, things are changing too fast for us to follow and adapt. Recently I read somewhere that Google now flags websites based on the number of bad links they have, and I've been wondering how to find the bad links pointing at my own site. With the recent Penguin update, this has become crucial.
For instance, let's say I have a link with the same keyword anchor from http://abc.com on *every* page of that site -- that's called a site-wide link, and it's considered bad. There are tons of examples like this.
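One rough way to spot site-wide links like that yourself is to take a backlink export (most backlink tools can export source URL and anchor text to CSV) and group the links by referring domain: a domain that links from many of its pages with one repeated anchor is a site-wide candidate. Here's a minimal sketch -- the column names `source_url` and `anchor_text` and the threshold of 50 pages are my own assumptions, so adapt them to whatever your tool exports:

```python
# Hedged sketch: flag candidate site-wide links in a backlink export.
# Assumes rows with "source_url" and "anchor_text" keys (hypothetical
# column names -- rename to match your backlink tool's CSV export).
from collections import defaultdict
from urllib.parse import urlparse

def sitewide_candidates(rows, min_pages=50):
    """Group backlinks by referring domain; a domain linking from many
    of its pages with a single repeated anchor is a site-wide candidate."""
    by_domain = defaultdict(lambda: {"pages": set(), "anchors": set()})
    for row in rows:
        domain = urlparse(row["source_url"]).netloc
        by_domain[domain]["pages"].add(row["source_url"])
        by_domain[domain]["anchors"].add(row["anchor_text"].strip().lower())
    return [
        (domain, len(info["pages"]))
        for domain, info in by_domain.items()
        if len(info["pages"]) >= min_pages and len(info["anchors"]) == 1
    ]
```

This won't tell you which links Google actually dislikes, but it's a cheap first pass for finding the obvious same-anchor-on-every-page patterns.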
How would you deal with such issues? If the business has to be maintained, then we have to deal with this now, before Google undermines it with its new algorithms. Is there any tool or something? I found Link Detox, but it's not feasible for me at the moment to shell out that much money.
Please suggest a few methods, fellow warriors.