How To Disavow Links

by Banned 59 replies
Disavowing isn't an easy process; it takes time and knowledge.

With this guide we try to make your life a little easier by showing you how to get rid of the most spammy links.

Before disavowing I would recommend you first try to get rid of the links yourself.

Did you buy spammy links before? Ask your provider to remove them.

However, most of the time they used automated tools and will be unable or unwilling to remove them, but it's worth a try of course.


Get ready, here we go:

- Sign up for Google Webmaster Tools if you haven't already
- Download all your links from Google Webmaster Tools
- Load all the links in Scrapebox
- Run a Pagerank check on them
- Sort them on pagerank from high to low
- Go through all the links with PR manually
- Click "Remove/Filter"
- Select "Remove urls containing"
- Type in the domain name of the high PR links that you do NOT want to disavow


Why? Because we are trimming the list as much as possible, so that in the end we can submit the whole remaining list to the Disavow tool.

In other words we are keeping the links that we want to disavow.

It's important to only type in the domain name, so that the inner pages on that domain are removed from the disavow list as well.

You know when you place a high PR blog post, more pages are created than just the root URL, and you want to keep those too.
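The trimming step above can be sketched in a few lines. This is a minimal sketch, not Scrapebox itself: `trim_disavow_list` and the example URLs are hypothetical, and the substring match on the host mimics the "Remove urls containing" filter, so inner pages on a kept domain drop out of the disavow list along with the root.

```python
from urllib.parse import urlparse

def trim_disavow_list(urls, keep_domains):
    """Drop every URL hosted on a domain we want to keep,
    so only disavow candidates remain in the list."""
    kept = []
    for url in urls:
        host = urlparse(url).netloc.lower()
        # Substring match on the domain, like Scrapebox's
        # "Remove urls containing" filter: root AND inner pages
        # on a kept domain are both removed from the list.
        if not any(domain in host for domain in keep_domains):
            kept.append(url)
    return kept

links = [
    "http://goodblog.com/my-guest-post",
    "http://goodblog.com/another-page",
    "http://spamsite.info/profile123",
]
print(trim_disavow_list(links, {"goodblog.com"}))
# only the spamsite.info URL remains
```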

The high PR links that you want to keep in your disavow list are blog comments and other spammy URLs with tons of OBL (outbound links).

If all went well you should now be left with a list of links that are all PR0 or PR n/a, plus some higher PR links that you already checked and decided to disavow.

To make your job a little easier you can also:

- Save all your high PR links in a text file first
- Reload that list in Scrapebox to replace the current list
- Launch the add-on named "Scrapebox Outbound Link Checker"
- Check all those URLs for the amount of OBL
- Get rid of the ones with >100 OBL
- But remove them by loading the original list again
- Use "Remove/Filter" -> "Remove urls containing", as explained above
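An OBL check like the add-on performs can be sketched with the standard library alone. This is an assumed, simplified version: `OutboundLinkCounter` is a hypothetical class, it parses a page's HTML and counts `<a>` tags whose host differs from your own, which is roughly what an outbound link count means here.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class OutboundLinkCounter(HTMLParser):
    """Count links on a page that point away from `own_host`."""
    def __init__(self, own_host):
        super().__init__()
        self.own_host = own_host
        self.obl = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href", "")
        host = urlparse(href).netloc
        # Relative links have an empty host and are internal,
        # so only count links to a different domain.
        if host and host != self.own_host:
            self.obl += 1

def count_obl(html, own_host):
    parser = OutboundLinkCounter(own_host)
    parser.feed(html)
    return parser.obl

page = '<a href="http://a.com/x">x</a><a href="/internal">i</a><a href="http://b.com">b</a>'
print(count_obl(page, "mysite.com"))  # -> 2
```

In practice you would fetch each URL first and feed the HTML to the counter, then drop everything over your OBL threshold.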

Ok, now we have a list of mainly PR0 and PR n/a links left.

Download Netpeak Checker from the web

Load the whole list in there

Select filters:

- PA
- Trust Flow
- MozRank
- MozTrust
- Google Indexed

And click START

The thing is, Google hasn't updated PR for a long time, so it could very well be that many of your PR n/a and PR0 links actually carry quite some value.

Now what I would do is remove the links that you want to keep, so that when you start exporting the list you only keep the ones that you want to disavow.

I would say keep the links with a Trust Flow higher than 3 and a PA of 10+ (these are just example numbers; fill in as you like, of course).
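That filtering step amounts to splitting the exported rows on two thresholds. A minimal sketch, assuming each row is a dict with hypothetical `tf` and `pa` keys as a metrics export might give you; the thresholds mirror the example numbers above and are meant to be tuned:

```python
def split_by_metrics(rows, min_tf=3, min_pa=10):
    """Split exported rows into links to keep (likely valuable)
    and links that stay in the disavow list."""
    keep, disavow = [], []
    for row in rows:
        # TF strictly higher than 3, PA of 10 or more, per the guide
        if row["tf"] > min_tf and row["pa"] >= min_pa:
            keep.append(row["url"])
        else:
            disavow.append(row["url"])
    return keep, disavow

rows = [
    {"url": "http://decent.com/post", "tf": 8, "pa": 25},
    {"url": "http://junk.biz/p?id=1", "tf": 0, "pa": 1},
]
keep, disavow = split_by_metrics(rows)
print(keep)     # ['http://decent.com/post']
print(disavow)  # ['http://junk.biz/p?id=1']
```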

Now run the filtered list through Scrapebox again and use the outbound link checker and get rid of the ones with many OBL.

There's a very good chance that there are still spammy links among the ones you removed from the list based on those filters, so this is not a 100% bulletproof method, of course.

However the majority of your spam links should still be in your list and it's not always needed to get rid of every single link.

As a last check I would go manually through your list to see if there are any niche relevant links among them.

Other links you might want to keep are natural links like:

- Facebook / Twitter / Pinterest and such
- Wordpress / Tumblr / Squidoo (assuming you used unique content)
- Niche relevant links
- Digg / Delicious / FolkD (assuming you used unique content)
- Youtube / Metacafe / Vimeo
- Slideshare / Docstoc (assuming you used unique content)
- and other links on top platforms that you put time into before

Sure, those links are weak, but if you used unique content and made them yourself it makes no sense to disavow them; it's only natural to have some of those links.

Another step you should take, before you remove the links you want to keep, is to copy them to a separate text file.

Then load them up in Spyglass and check the anchor text of the links.

If you used the same anchor text many times, you might want to get rid of some more links to prevent over-optimization.
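A quick way to spot over-optimized anchors is to look at each anchor text's share of the whole profile. A sketch under assumptions: `anchor_report` and the 25% threshold are hypothetical, the idea is just that an exact-match anchor making up a large share of your links is a red flag.

```python
from collections import Counter

def anchor_report(anchors, threshold=0.25):
    """Return anchor texts whose share of the profile exceeds
    `threshold`, normalized to lowercase."""
    counts = Counter(a.lower().strip() for a in anchors)
    total = len(anchors)
    return {a: n / total for a, n in counts.items() if n / total > threshold}

anchors = ["buy blue widgets"] * 6 + ["brandname", "click here", "blue widgets", "brandname"]
print(anchor_report(anchors))  # {'buy blue widgets': 0.6}
```

Anything the report flags is a candidate for removal requests or the disavow file.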

Some other links you might want to disavow are sitewide links (in the hundreds, for example) that use exact-match anchor text, but not before you attempt to ask the webmaster in question to change the anchor text to your URL.

Additionally, you can do a quick PR check on the root domains; perhaps you'll find links on PR7+ platforms (which at least indicates it's a high-quality site you have a link on) that you want to keep. All the really spammy platforms, like the hundreds of bookmark blasts or the thousands of forum profiles, mostly sit on PR2, PR3, PR4, PR5 domains and such, not really the top of the food chain.

Are you missing something in this guide, or do you disagree with certain parts? Let us know so we can all get wiser from it.

This guide is meant to get rid of the most spammy links. I don't think there's a perfect way to disavow, especially not in a fully automated way, but leave your comments so we can make it better.
#search engine optimization #disavow #links
  • A bit of content in this section for once. Great stuff. Although, GWT links are not showing for me, but I have links that show in Ahrefs. Can you take links from other programs and load them into the tool as well? Or does it only work with links seen in GWT?
    • [1] reply
    • Banned
      GWT is usually a bit slow in showing links. Is it very recent that you added these links?

      You can first export all links from Ahrefs, Majestic, OpenSiteExplorer (if you have subscriptions there), then load them into Scrapebox and remove duplicate URLs.
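The merge-and-dedupe step from that reply can be sketched without Scrapebox. A minimal, hypothetical version, where each export is just an iterable of URLs (one per line in the real tool exports):

```python
def merge_link_exports(*exports):
    """Merge several backlink exports (iterables of URLs) and drop
    duplicates while preserving first-seen order."""
    seen, merged = set(), []
    for urls in exports:
        for url in urls:
            url = url.strip()
            if url and url not in seen:
                seen.add(url)
                merged.append(url)
    return merged

ahrefs = ["http://a.com/1", "http://b.com/2"]
majestic = ["http://b.com/2", "http://c.com/3"]
print(merge_link_exports(ahrefs, majestic))
# ['http://a.com/1', 'http://b.com/2', 'http://c.com/3']
```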
  • As Google has stated time and time again, only use disavow as a last measure. That means you need to try and get links removed multiple times, and provide Google proof before working with the disavow tool

    I have a friend whose submissions look like they have been banned because he just kept submitting disavows with no actual link removal. I believe Matt Cutts talked somewhere about how disavows without proper work can lead to being banned from the tools. AKA they won't even process the submission.
    • [1] reply
    • Banned
      Google removed the option to file for reinclusion when not having received a manual penalty message in webmasters.

      So how are we supposed to deliver the proof?

      You got to know that Google doesn't send out messages in webmasters too often, most penalties are algorithm based.
      • [1] reply
  • One disavow file per site/domain.
  • Sorry Dennis but..
    I'm yet to hear that Google makes its decisions on an unnatural-link algo filter or manual spam action based on the PR, PA, and DA of the inbound links.
    The vast majority of created pages have no authority at all, yet they make up most of the web. And to pigeonhole all those properties into the "bad neighbourhood" section of the web is just an insane piece of advice to give.
    • [ 4 ] Thanks
    • [1] reply
    • Banned
      That's why I suggested to keep the links at popular platforms.

      You can easily run them through Scrapebox to check the PR of the root domain and keep the ones on PR6+ sites, for example (unless they are spammy blog comments or profile links on totally unrelated sites, of course).

      Unless you count all the web2.0 sub domains most pages do have some sort of authority as they connect to pages with PR.

      Especially on this forum, where most links are blasted with software, it's an easy way to get rid of the majority.
      • [1] reply
  • Banned
    I'm not getting into the PR update issue (I just don't care), but there's more to SEO than PR.

    Are you saying you would turn down a link on a 100% relevant page, even if that page was a PR0? Looks like a bad move IMO.

    This is where those DIY moz metrics fall on their face, they can't gauge common sense.



    Also...

    You ever looked at Google SERPs and seen that Similar link (related:domain.com)?

    Just because a page has a lot of outbound links doesn't necessarily mean it's a bad thing as long as the majority of those links are pointing to other domains in the same niche.

    If a PR0 page has a bunch of outbound links pointing at same niche sites, that page might be spread thin on PR but it can still help associate your page (your link on the same backlink page) with all those authority sites. It's the SEO equivalent of rubbing elbows with the rich & famous (authority links/pages).



    • [1] reply
    • Banned
      Where did I say that? I think I said this:

      "As a last check I would go manually through your list to see if there are any niche relevant links among them."
  • This whole process sounds kind of like spamming the Disavow Tool to me.

    I don't think you should use automation to identify spammy links. It takes more than a few metrics to identify a spammy link. Each page with a link should be visited to determine if it is worth keeping or not.

    Also, when you submit a list of links to the Disavow Tool you should detail the efforts you have made to remove each link.

    For example, include a comment like...

    or

    If you are not doing that, it is much more likely your Disavow request will be thrown into the bin of the ones that get ignored.
    • [2] replies
    • Banned
      Dude freaking wake up.

      Why do you think the module automatically recognizes the URLs or domains? Because they go through it manually? :rolleyes:

      It gets crazier by the day on this forum.

      I understand that people like to preach 100% whitehat to be able to charge more.

      I also understand why people preach to go through each link manually as that takes many hours and thousands of dollars can and will be charged for that.

      But keep that nonsense for your clients with the money burning in their pockets.

      Hello, this is an SEO forum full of starters/hobbyists; the majority here isn't running a large company where a few thousand dollars more or less doesn't matter!


      PS: That whole reporting thing applies when you file for reinclusion because you got a manual penalty and a message in your Webmaster Tools dashboard.
      • [1] reply

    • Ultra newb question here... Where do you find the webmaster of different sites? whois.com?

      I had previously purchased some link building plans and it turned out that I got dinged pretty hard by Penguin 2.1. GWT is not showing ANY links to my page, but Ahrefs is. I do have the reports of where the links were placed, but still don't know how to find the webmaster contact info.

      Thanks in advance.
      Dave
      • [2] replies
  • I'm not going to argue with you. You do whatever you want.

    I simply offered another view on it because I think you are leading people down a wrong path by recommending that they automate the process, and especially recommending anyone use a few useless metrics (like anything provided by SEOmoz) to identify toxic links. They are likely to end up disavowing some links that are worth keeping.

    My suggestion has nothing to do with prices for work or doing anything whitehat. I don't take on link cleanup projects for precisely the reason that they are entirely too time consuming to do properly.
    • [ 2 ] Thanks
    • [1] reply
    • Banned
      Why do you even make the effort when you're so sure that Google will ignore the disavow report if you don't document it in detail?
      • [1] reply
  • I think this is where we all need to make sure we are tracking the links that we create. I have spreadsheets with every link my team creates for every client we work on. For a new client we pull existing links and we have a separate spreadsheet for those so that I know where I begin and end. Then I at any point can feed both of these lists to scrapebox along with the results of ahrefs and see which links are new and not created by me (remove duplicate URLs.) Those would be the only links that would be wild cards, those are the only ones I'd have to worry about because I build good links.

    Now if you are dealing with a client that has come to you with a messy link profile or even a penalty then you are dealing with a different kettle of fish. I HAVE to look at those links, I have to look at those pages. Something could have a PR6 on it but be a spammed out nightmare. I mean people buy dropped domains and put all kinds of garbage sites on them. I don't want those links, nobody should want those links.
    • [ 2 ] Thanks
  • Please do not talk about disavowing links with me. Seven months in this company, Yukon, and I've gone through 300,000 effing links. One by one by one by one by one LOL. That's NOT the only thing that I do, but it is part of the maintenance that I have to do on a monthly basis :O.

    E-mail requests to remove the link, follow-up e-mails, and THEN disavow them. All done manually :OOOOOO
    • [2] replies
    • Banned
      The links you showed me were messed up internal links. I don't remember any external links.
      • [1] reply
    • Banned
      Pfff...

      Now I understand why you always mention the models at your work, without them <fill in yourself>
      • [1] reply
  • What Mueller basically said was that if there was a domain containing a lot of spammy links to your site, just use the domain operator. They were seeing many people list every single link from a single domain, one by one, when they could have just disavowed the whole domain.
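For reference, the disavow file itself is a plain UTF-8 text file with one entry per line: a `domain:` entry disavows every link from that domain, a full URL disavows a single page, and lines starting with `#` are comments. The domains below are placeholders:

```text
# removal requests sent twice, no response
domain:spamsite.info
domain:linkfarm.biz
# only one page on this site is bad, so list the URL itself
http://mixedsite.com/spammy-comment-page
```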
  • I'll add my 2 cents.

    Don't worry about disavowing nofollowed links.
    • [1] reply
    • Yeah, that's not true either. Matt Cutts said there are cases where they can view nofollow links as spammy too, and they can negatively impact your rankings.
  • I guess Google are not united on that issue then.

    John Mueller says not to worry about them.

    Do nofollowed links need to be included in your disavow file? Do they affect Penguin?
    • [1] reply
    • I had not seen that. Matt Cutts mentioned it in a video recently. This video was from May, so maybe they changed their stance on them. Matt didn't say anything about the Disavow Tool. He just mentioned that they could actually hurt rankings, which many people thought they couldn't. That was why I have been throwing nofollow links in disavow reports.

      Oh well. Guess it can't hurt to go ahead and disavow them.
  • Banned
    He said: "for example"
    • [1] reply
    • Yeah with no other examples.

      Either way your method is going to lead people to disavow links that are actually helping them. The opposite is true too. They might be leaving links that should be disavowed.

      Things like PR, DA/PA, CF, and TF do not determine if a link is toxic or not.
      • [2] replies
  • But as far as the internal links go, and the on-page content of our site, I've been going back and forth with the tech team. Who would have thought that we'd have software engineers for a clothing company lol. We use a platform called https://jira.atlassian.com/ to set up tasks. We've changed a lot of things except for the URL structure (dynamic at the moment), which needs to be changed. But we're redesigning the whole site by December. Crazy how we generate that much income on a crappy ecommerce site.

    Anyways, sorry for getting off topic folks lol.
  • Banned
    I'm pretty sure it's just a marketing tactic from Google to make it sound like it's not as easy as just uploading a list of URLs and calling it a day.

    Just like they always said that negative SEO wasn't possible.

    For sure that Google ain't gonna check every disavow report.

    Recently I did a disavow report for my own site, we'll see what happens.
    • [1] reply
    • Banned
      Why disavow your own links, unless you use crappy links?
      • [1] reply
  • You think the valuable (high PR/DA/PA/TF whatever) paid links aren't the ones actually causing your site issues? What about the ones with ridiculous anchor text abuse?
    • [1] reply
    • Banned
      Maybe people should learn to read.

      All these comments about niche relevant links and over abused anchor text links are all addressed in the OP.
  • That's got nothing to do with what I said and the simplicity of the situation implied in the OP shows why sites get penalised in the first place.
    • [1] reply
    • Banned
      This is what you said:

      "What about the ones with ridiculous anchor text abuse?"

      And exactly that is addressed in the OP so why are you asking when it's already explained in the guide?

      The first part of your question I simply ignored, because every single day I prove that you can still rank very well based solely on PAID links on high DA/PA/PR etc. pages. Google has a lot of work to do before they manage to catch up with that, as they aren't even close to where they want to be.

      Funny that you question high PR links while at the same time selling them in your sig.

      This forum is freaking nuts!
  • haha I don't question that they work at all, but I also know they can get a site penalised, and your advice will not disavow the very links that will be giving the user the penalty.

    Why does Google need to penalise the links that don't work but allow the ones that do? Your logic is flawed.
    • [1] reply
    • Banned
      I have quite a few affiliate sites, and they are definitely not award-winning ones; they only receive high PR links from my own network, but I never saw any of them get penalized.

      If you really think that those very links will be giving the user a penalty, then why do you sell them? Makes no sense to me, unless you have churn & burn customers.

      Google's algorithm is obviously not very able to flag those high PR links as unnatural.

      And all those crap links do work to a certain extent, or Google wouldn't penalize sites based on such automated profiles. The thing is, you need way too many of them, and if you disavow a handful or a dozen of those by accident, the effect is hardly noticeable.
  • Quick question.
    Has anyone seen any good results from submitting a disavow file to Google?
    Like has anyone seen their rankings restored?
  • Thank you for the info...

    One last question.... do I need to send an email for each specific link or can I request it be taken off their domain?

    i.e. zzzzzz.com is the referring domain, but there are 10 links somewhere on that domain; zzzzzz.com/blabla/blogcommentX, etc. etc.

    Separate or together?

    Lastly, why aren't any of them showing up in GWT?

    _Dave
    • [1] reply

    • I would send one email with all of the URLs listed. No need to bombard them with a bunch of emails.
      • [ 1 ] Thanks
