Hiding Private Networks From Open Site Explorer, Majestic, Ahrefs & Spyglass?

97 replies
  • SEO
Can anyone advise on how to prevent the main backlink checkers from picking up and displaying the backlinks that you build from sites in your private network?

I guess it's something to do with setting up a robots.txt file that blocks their spiders, however, I can't work out how to do that.

Any advice appreciated.
#ahrefs #explorer #hiding #majestic #networks #open #private #site #spyglass
  • Profile picture of the author seobuddy
    Backlinks are out of your control, unless you have admin access to them!
    {{ DiscussionBoard.errors[5957235].message }}
    • Profile picture of the author rob1123
      You would need to figure out the IP addresses of the robots you want to block and blacklist them, if I'm right (don't know if I am, BTW). All of this can be done in cPanel using the IP Deny Manager and possibly the raw access logs.

      Like I say, having never done this before I'm not 100% sure. If you want to create a robots.txt file, Webmaster Tools has a 'wizard' to do so.
      Signature

      derp.

      {{ DiscussionBoard.errors[5957329].message }}
  • Profile picture of the author Mike Anthony
    Here's a brief intro. Not too hard:

    The Web Robots Pages

    Then you just need the robot's name to disallow. Ummmm... I can't remember them all now; guess I need to go back and watch one of my videos. lol

    I know I hadn't found one for SEO SpyGlass yet (their new one), and Majestic gives the name out - I THINK it's MJ12bot. Ahhh... and Open Site Explorer is rogerbot.

    Hope that helps - good move blocking those.

    And yes, guys, it not only can be done, it's standard practice.
    {{ DiscussionBoard.errors[5957278].message }}
    • Profile picture of the author mark@1to101
      Originally Posted by seobuddy View Post

      Backlinks are out of your control, unless you have admin access to them!
      I'm talking about backlinks from sites that I own that are linking to other sites that I own and to clients' sites, so I do have access to the admin side of them.

      Originally Posted by dp40oz View Post

      I'm not sure it can be done.
      I'm not 100% sure that it can be but I thought that I had heard it mentioned before.
      {{ DiscussionBoard.errors[5957292].message }}
    • Profile picture of the author mark@1to101
      Originally Posted by Mike Anthony View Post

      I know I hadn't found one for SEO SpyGlass yet (their new one), and Majestic gives the name out - I THINK it's MJ12bot. Ahhh... and Open Site Explorer is rogerbot.
      Thanks Mike

      So, it would be, for example,...

      User-agent: rogerbot
      Disallow: /

      ...to exclude specific backlink checkers. Or, how about excluding all robots except Google, Yahoo and Bing by using something like...

      User-agent: Googlebot
      Disallow:

      User-agent: *
      Disallow: /

      ...then it wouldn't be necessary to know the specific bots for all of the backlink checkers.
      {{ DiscussionBoard.errors[5957368].message }}
      • Profile picture of the author mark@1to101
        I just did some research on the different bots that the backlink checkers use, and Mike was 100% right with the OSE (rogerbot) and Majestic SEO (MJ12bot) ones.

        I also found out that the bot for Ahrefs is AhrefsBot and that these...

        dotbot
        gigabot
        exabot

        ...have also been associated with Linkscape, so it might be worth blocking those too.

        I couldn't find the name of the SEO Spyglass bot so I've opened a support ticket with them to ask them what it is. Maybe they'll tell me and maybe they won't.
        {{ DiscussionBoard.errors[5958033].message }}
        • Profile picture of the author mark@1to101
          For anyone who wants to do this, I think this is the text that you need to put in your robots.txt file...

          User-agent: *
          Disallow:

          User-agent: rogerbot
          Disallow: /

          User-agent: exabot
          Disallow: /

          User-agent: MJ12bot
          Disallow: /

          User-agent: dotbot
          Disallow: /

          User-agent: gigabot
          Disallow: /

          User-agent: AhrefsBot
          Disallow: /

          ...and then save the file and upload it to the root directory.

          Doing so should stop your network sites showing up in OSE, Ahrefs and Majestic SEO if competitors run a backlink analysis on your money sites or your clients' sites. I'll let you know if I find out the name of the SEO Spyglass bot.
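          If you want to sanity-check a file like that before uploading it, Python's built-in robotparser module can simulate how a compliant bot would read it. A quick sketch (the rules string is an abridged version of the file above; the paths are just examples):

```python
from urllib import robotparser

# Abridged version of the robots.txt rules above
RULES = """\
User-agent: *
Disallow:

User-agent: rogerbot
Disallow: /

User-agent: MJ12bot
Disallow: /

User-agent: AhrefsBot
Disallow: /
"""

rp = robotparser.RobotFileParser()
rp.modified()  # mark the rules as loaded so can_fetch() will evaluate them
rp.parse(RULES.splitlines())

print(rp.can_fetch("Googlebot", "/page.html"))  # True - falls through to the * group
print(rp.can_fetch("MJ12bot", "/page.html"))    # False - blocked
print(rp.can_fetch("AhrefsBot", "/"))           # False - blocked
```

          Of course this only tells you what a bot that respects robots.txt would do; it says nothing about bots that ignore the file.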
          {{ DiscussionBoard.errors[5958560].message }}
          • Profile picture of the author dp40oz
            Originally Posted by mark@1to101 View Post

            For anyone who wants to do this, I think this is the text that you need to put in your robots.txt file...

            User-agent: *
            Disallow:

            User-agent: rogerbot
            Disallow: /

            User-agent: exabot
            Disallow: /

            User-agent: MJ12bot
            Disallow: /

            User-agent: dotbot
            Disallow: /

            User-agent: gigabot
            Disallow: /

            User-agent: AhrefsBot
            Disallow: /

            ...and then save the file and upload it to the root directory.

            Doing so should stop your network sites showing up in OSE, Ahrefs and Majestic SEO if competitors run a backlink analysis on your money sites or your clients' sites. I'll let you know if I find out the name of the SEO Spyglass bot.
            Any chance Mike can confirm this is correct? The idea is cool but I am wondering if this will lead to a pretty big footprint.
            {{ DiscussionBoard.errors[5958887].message }}
            • Profile picture of the author RevSEO
              Originally Posted by dp40oz View Post

              Any chance Mike can confirm this is correct? The idea is cool but I am wondering if this will lead to a pretty big footprint.
              Chances are there are other elements that will lead to a bigger footprint than that. Unless you are building out a network that Google has its eyes on, they won't run a robots.txt analysis looking for Disallows like that. It sounds like this poster is keeping his network private, so he should be fine. Chances are there's a bigger footprint somewhere else on his network than this robots.txt modification.

              Having said that, obviously don't disallow Google and other SE bots, since after all that's why you are creating the network in the first place.

              Either way, great thread, if you run a private network this should be implemented.
              {{ DiscussionBoard.errors[5959080].message }}
            • Profile picture of the author mark@1to101
              Originally Posted by dp40oz View Post

              The idea is cool but I am wondering if this will lead to a pretty big footprint.
              It is a footprint, but then not doing it makes it easy for competitors to find your network and submit a spam report on your sites. So you're taking a chance either way, I think.
              {{ DiscussionBoard.errors[5959122].message }}
            • Profile picture of the author Mike Anthony
              Originally Posted by dp40oz View Post

              Any chance Mike can confirm this is correct? The idea is cool but I am wondering if this will lead to a pretty big footprint.
              I guess you are talking about the idea, but yes, it's totally cool and perfectly acceptable. You have the right to limit your bandwidth and protect your site from bots. It's also smart, because if a competitor cannot see your backlinks he has little to report to Google. To answer Mark's other question, I tend not to like universal blocking. It can have some unintended consequences, blocking crawls that you may want. Plus, if there's ever a change in a bot's name (say Google's), your site will drop out of the index almost as fast as a deindexing, and if you don't know or remember your robots.txt at the time you could have a heart attack. Still, I know people who advocate it.
              {{ DiscussionBoard.errors[5959668].message }}
            • Profile picture of the author MarQueteer
              Originally Posted by dp40oz View Post

              The idea is cool but I am wondering if this will lead to a pretty big footprint.
              I doubt it, but instead of solving this via robots.txt, it isn't rocket science to block certain bots via .htaccess instead.

              The advantage: it's a real block, whether or not the bots respect your robots.txt (many bots simply ignore robots.txt and still crawl everything), and no one except your server can read it.
              {{ DiscussionBoard.errors[5959914].message }}
              • Profile picture of the author mark@1to101
                Originally Posted by MarQueteer View Post

                I doubt it, but instead of solving this via robots.txt, it isn't rocket science to block certain bots via .htaccess instead.

                The advantage: it's a real block, whether or not the bots respect your robots.txt (many bots simply ignore robots.txt and still crawl everything), and no one except your server can read it.
                I just looked into this and it does seem that blocking access via .htaccess is better.

                I think it would look something like this...

                 RewriteEngine On
                 RewriteCond %{HTTP_USER_AGENT} rogerbot [NC,OR]
                 RewriteCond %{HTTP_USER_AGENT} exabot [NC,OR]
                 RewriteCond %{HTTP_USER_AGENT} MJ12bot [NC,OR]
                 RewriteCond %{HTTP_USER_AGENT} dotbot [NC,OR]
                 RewriteCond %{HTTP_USER_AGENT} gigabot [NC,OR]
                 RewriteCond %{HTTP_USER_AGENT} AhrefsBot [NC]
                 RewriteRule .* - [F,L]

                ...though I may be wrong.
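                 One thing to watch with those RewriteCond patterns: most of these bots announce themselves with a user agent like "Mozilla/5.0 (compatible; MJ12bot/v1.4; ...)", so the bot name sits in the middle of the string. A pattern that matches anywhere in the string, case-insensitively, is safer than one anchored with ^. A quick sketch of that matching logic in Python (the UA strings are just illustrative examples):

```python
import re

# Unanchored and case-insensitive - mirrors what the RewriteCond lines need to match
BOTS = re.compile(r"rogerbot|exabot|MJ12bot|dotbot|gigabot|AhrefsBot", re.IGNORECASE)

agents = [
    "Mozilla/5.0 (compatible; MJ12bot/v1.4.8; http://mj12bot.com/)",
    "Mozilla/5.0 (compatible; AhrefsBot/4.0; +http://ahrefs.com/robot/)",
    "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36",  # ordinary browser
]

for ua in agents:
    blocked = bool(BOTS.search(ua))
    print(f"{'403' if blocked else '200'}  {ua}")
```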
                {{ DiscussionBoard.errors[5960088].message }}
                • Profile picture of the author Mike Anthony
                  Originally Posted by mark@1to101 View Post

                  I just looked into this and it does seem that blocking access via .htaccess is better.
                  May very well be - I have never looked into it much. I don't think I will be pushing that one too much to people, though, because you can really screw up a site badly within .htaccess, and I don't want the headaches.
                  {{ DiscussionBoard.errors[5960173].message }}
                • Profile picture of the author terminator-tobe
                  Originally Posted by mark@1to101 View Post

                  I just looked into this and it does seem that blocking access via .htaccess is better.

                  I think it would look something like this...

                   RewriteEngine On
                   RewriteCond %{HTTP_USER_AGENT} rogerbot [NC,OR]
                   RewriteCond %{HTTP_USER_AGENT} exabot [NC,OR]
                   RewriteCond %{HTTP_USER_AGENT} MJ12bot [NC,OR]
                   RewriteCond %{HTTP_USER_AGENT} dotbot [NC,OR]
                   RewriteCond %{HTTP_USER_AGENT} gigabot [NC,OR]
                   RewriteCond %{HTTP_USER_AGENT} AhrefsBot [NC]
                   RewriteRule .* - [F,L]

                  ...though I may be wrong.
                   Do you know if this code works (i.e. is the syntax correct)?
                  {{ DiscussionBoard.errors[6455940].message }}
                • Profile picture of the author squadron
                  Originally Posted by mark@1to101 View Post

                  I think it would look something like this...

                   RewriteEngine On
                   RewriteCond %{HTTP_USER_AGENT} rogerbot [NC,OR]
                   RewriteCond %{HTTP_USER_AGENT} exabot [NC,OR]
                   RewriteCond %{HTTP_USER_AGENT} MJ12bot [NC,OR]
                   RewriteCond %{HTTP_USER_AGENT} dotbot [NC,OR]
                   RewriteCond %{HTTP_USER_AGENT} gigabot [NC,OR]
                   RewriteCond %{HTTP_USER_AGENT} AhrefsBot [NC]
                   RewriteRule .* - [F,L]

                  ...though I may be wrong.
                  Looks good, and if you want to get the IP address for each of these bots, insert some stats code in your 403 page that logs visitor details. It may take a few months to harvest the IP addresses, but you can then just block the bots by IP address.
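                   A minimal sketch of that harvesting step, assuming a standard Apache "combined" log format (the sample lines and addresses are made up for illustration):

```python
import re
from collections import defaultdict

# Apache "combined" log format; the sample lines below are made up for illustration
LOG_LINE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "[^"]*" (\d{3}) \S+ "[^"]*" "([^"]*)"')

def harvest_403_ips(lines):
    """Collect IP -> set of user agents for every request that got a 403."""
    hits = defaultdict(set)
    for line in lines:
        m = LOG_LINE.match(line)
        if m and m.group(2) == "403":
            hits[m.group(1)].add(m.group(3))
    return hits

sample = [
    '1.2.3.4 - - [01/Feb/2013:10:00:00 +0000] "GET / HTTP/1.1" 403 199 "-" "Mozilla/5.0 (compatible; AhrefsBot/4.0)"',
    '5.6.7.8 - - [01/Feb/2013:10:00:05 +0000] "GET / HTTP/1.1" 200 5120 "-" "Mozilla/5.0"',
]
print(dict(harvest_403_ips(sample)))
```

                   Point it at your real access log and, over time, you build up the list of IPs to deny.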
                  {{ DiscussionBoard.errors[7650457].message }}
              • Profile picture of the author RevSEO
                Originally Posted by MarQueteer View Post

                I doubt it, but instead of solving this via the robots.txt, it isn't rocket science to block certain bots via .htaccess instead.

                The advantage: It's a real block, no matter if the bots respect your robots.txt or not (many bots simply ignore the robots.txt and still crawl everything) and noone except your server can read it.
                Even better advice. Blocking via .htaccess is indeed the best solution, since bots won't be able to see what's blocked, unlike with a robots.txt.
                {{ DiscussionBoard.errors[5960668].message }}
          • Profile picture of the author rosesmark
            Originally Posted by mark@1to101 View Post

            For anyone who wants to do this, I think this is the text that you need to put in your robots.txt file...

            User-agent: *
            Disallow:

            User-agent: rogerbot
            Disallow: /

            User-agent: exabot
            Disallow: /

            User-agent: MJ12bot
            Disallow: /

            User-agent: dotbot
            Disallow: /

            User-agent: gigabot
            Disallow: /

            User-agent: AhrefsBot
            Disallow: /

            ...and then save the file and upload it to the root directory.

            Doing so should stop your network sites showing up in OSE, Ahrefs and Majestic SEO if competitors run a backlink analysis on your money sites or your clients' sites. I'll let you know if I find out the name of the SEO Spyglass bot.

            I feel that this kind of code could work for hiding backlinks from bots or users.
            {{ DiscussionBoard.errors[5962044].message }}
  • Profile picture of the author Steadyon
    Are you putting this robots.txt file on the sites you control that are linking to your money site?

    Pardon my ignorance, but how does this stop these guys from finding links to your site in other ways?

    Perhaps by using Yahoo, Google and a few other ways.

    What you are suggesting is that the spiders you intend to block are scouring the whole internet. Is this what those particular spiders do?

    So in other words, those spiders scan the whole internet to see who is linking to whom, etc.

    Is that right?
    {{ DiscussionBoard.errors[5959447].message }}
    • Profile picture of the author mark@1to101
      Originally Posted by Steadyon View Post

      Are you putting this robots.txt file on the sites you control that are linking to your money site?
      Yes, on the sites I control that link to money sites.

      Originally Posted by Steadyon View Post

      Pardon my ignorance, but how does this stop these guys from finding links to your site in other ways?

      Perhaps by using yahoo, google and a few other ways.
      Google doesn't give much in the way of backlink data outside of Webmaster Tools (and not even much in it), so competitors can't use it to check out your backlink profile. Yahoo Site Explorer doesn't work any more. Any other ways that I know of aren't very effective at assessing backlink profiles.

      Originally Posted by Steadyon View Post

      What you are suggesting is that the spiders you intend to block are scouring the whole internet. Is this what those particular spiders do?

      So in other words, those spiders scan the whole internet to see who is linking to whom, etc.
      The spiders being blocked are just those from the main backlink checkers which competitors will use to assess the backlink profiles of your money sites. You don't want them to see your network sites as they may send a spam report to Google.
      {{ DiscussionBoard.errors[5959544].message }}
  • Profile picture of the author alexmobile
    Backlink crawlers are not obligated to care about your robots.txt setup... so don't be too surprised when they just ignore whatever you put in your robots.txt. A better option would be to block their IP addresses, or even better, have a custom plugin that hides all external hrefs when the request comes from a known crawler IP.
    Signature
    QRID.com - login to websites by scanning QR code on your monitor! No Passwords!
    {{ DiscussionBoard.errors[5961378].message }}
    • Profile picture of the author dp40oz
      Originally Posted by alexmobile View Post

      Backlink crawlers are not obligated to care about your robots.txt setup... so don't be too surprised when they just ignore whatever you put in your robots.txt. A better option would be to block their IP addresses, or even better, have a custom plugin that hides all external hrefs when the request comes from a known crawler IP.
      Do you know of any plugins that could be customized to do that by chance?
      {{ DiscussionBoard.errors[5961451].message }}
    • Profile picture of the author Mike Anthony
      Originally Posted by alexmobile View Post

      Backlink crawlers are not obligated to care about your robots.txt setup... so don't be too surprised when they just ignore whatever you put in your robots.txt.
      Most backlink checkers honor robots.txt. As long as they do, I really wouldn't worry about other bots.

      A better option would be to block their IP addresses, or even better, have a custom plugin that hides all external hrefs when the request comes from a known crawler IP.
      A) Too technical for most people.
      B) You then have to track changes that may occur with IP addresses.

      Keeping up with a few bot names for the backlink checkers is much simpler and more elegant, and again, as long as they honor the robots.txt, you don't have to be concerned about whether other bots will or won't.

      The .htaccess approach has merit though.
      {{ DiscussionBoard.errors[5964598].message }}
  • Profile picture of the author linkservice
    Interesting stuff. Would like to see more info on this.
    {{ DiscussionBoard.errors[5961881].message }}
  • Profile picture of the author SpiderZq
    I also did a lot of searching on this, but no success. Maybe some sharp Warrior has a solution.
    {{ DiscussionBoard.errors[5962059].message }}
  • Profile picture of the author adam westrop
    Any word on the SEO SpyGlass spiders? Any way to find out what they would be?
    {{ DiscussionBoard.errors[5963121].message }}
    • Profile picture of the author mark@1to101
      Originally Posted by adam westrop View Post

      Any word on the SEO SpyGlass spiders? Any way to find out what they would be?
      I've got a support ticket open with them to ask them but they haven't got back to me yet.

      I can see that it's not in their interests to tell me, but other backlink checkers openly give away details of their bots / spiders.
      {{ DiscussionBoard.errors[5963239].message }}
      • Profile picture of the author Mike Anthony
        Originally Posted by mark@1to101 View Post


        I can see that it's not in their interests to tell me, but other backlink checkers openly give away details of their bots / spiders.
        It's proper etiquette to give that out, plus they just released an update to their software about creating robots.txt files:


        SEO PowerSuite to Allow Creating and Managing Robots.Txt Files Right from the Software

        So I would think they would oblige, unless they are using a third-party bot and want to keep it secret for marketing reasons.
        {{ DiscussionBoard.errors[5964662].message }}
        • Profile picture of the author mark@1to101
          Originally Posted by Mike Anthony View Post

          It's proper etiquette to give that out, plus they just released an update to their software about creating robots.txt files.

          So I would think they would oblige, unless they are using a third-party bot and want to keep it secret for marketing reasons.
          I've received a reply back from them now saying that only their senior developer knows that info and he's on vacation at the moment but they'll get back to me when he returns.
          {{ DiscussionBoard.errors[5964788].message }}
          • Profile picture of the author HairyKrishna
            Thanks Mark, great work!

            I actually recently blocked all bots except Google, but I would not recommend that in general. It's just that in my country, about 90% of searchers use Google anyway, and then 7% of the other search engines use Google's data for their results.

            I'm not expert enough to figure out the htaccess option, but I'm definitely part of the experiment to see if they do respect robots.txt.

            I guess it still takes a while before your backlinks drop from their systems.
            {{ DiscussionBoard.errors[5973765].message }}
  • Profile picture of the author juhlieri
    nice info.. i like this
    {{ DiscussionBoard.errors[5964734].message }}
  • Profile picture of the author adam westrop
    Thanks Mark. Please keep us updated, I would like to block them all so thats really good news.

    I'm going to take GWMT off my sites due to a lot of people suggesting Google can track more about you... So how do I amend the bots' access via robots.txt or .htaccess? Which should I prefer?

    I use Wordpress.
    {{ DiscussionBoard.errors[5978578].message }}
  • Profile picture of the author HairyKrishna
    I guess I can confirm that robots.txt blocking is not effective. One of my competitors blocks MJ12bot specifically, yet all his backlinks still show up in Majestic SEO, or at least just as many as in SEO SpyGlass.
    {{ DiscussionBoard.errors[6010236].message }}
    • Profile picture of the author adam westrop
      Originally Posted by HairyKrishna View Post

      I guess I can confirm that robots.txt blocking is not effective. One of my competitors blocks MJ12bot specifically, yet all his backlinks still show up in Majestic SEO, or at least just as many as in SEO SpyGlass.
      It'll take a couple of months to see any effects.
      {{ DiscussionBoard.errors[6014362].message }}
      • Profile picture of the author HairyKrishna
        Originally Posted by adam westrop View Post

        It'll take a couple of months to see any effects.
        Unless my memory fails me, he has had this up for at least a year now. But definitely more than 6 months. I didn't understand why he had it at the time, but just copied it anyway for one of my sites. That site, too, still shows backlinks.

        So while it's a great idea, I'm not sure if it works.

        Either because the bots don't obey robots.txt, or, as someone mentioned before, because they index the page containing a link to your site, so it will show up no matter how skilfully you block the bots.

        So if you have private link networks, it would be good to block bots from those websites as well if possible.
        {{ DiscussionBoard.errors[6064940].message }}
        • Profile picture of the author mark@1to101
          Originally Posted by HairyKrishna View Post

          Unless my memory fails me, he has had this up for at least a year now. But definitely more than 6 months. I didn't understand why he had it at the time, but just copied it anyway for one of my sites. That site, too, still shows backlinks.

          So while it's a great idea, I'm not sure if it works.

          Either because the bots don't obey robots.txt, or, as someone mentioned before, because they index the page containing a link to your site, so it will show up no matter how skilfully you block the bots.

          So if you have private link networks, it would be good to block bots from those websites as well if possible.
          Whilst robots.txt can be ignored, apparently .htaccess files can't be. So maybe try the .htaccess option instead.
          {{ DiscussionBoard.errors[6065199].message }}
  • Profile picture of the author Svetislav
    After all those changes that Google made, I wouldn't use any network - they de-indexed some private networks as well. I would wait a couple of months and see if anything changes, but for now I wouldn't use one...
    {{ DiscussionBoard.errors[6010552].message }}
  • Profile picture of the author nomy
    Hi Mark,

    Any update on what the name of spyglass bot is?
    {{ DiscussionBoard.errors[6028900].message }}
    • Profile picture of the author mark@1to101
      Originally Posted by nomy View Post

      Hi Mark,

      Any update on what the name of spyglass bot is?
      They didn't get back to me.

      If I was them I wouldn't tell people the bot name either though. It's a risk for them because if lots of people started blocking their bot then their product would be less effective.
      {{ DiscussionBoard.errors[6030385].message }}
  • Profile picture of the author zoobie
    great info thanks!
    {{ DiscussionBoard.errors[6028944].message }}
  • Profile picture of the author dp40oz
    Wait a second. Doesn't SpyGlass get its information from other search engines? It doesn't have its own crawler. So you would need to block the search engines, Alexa etc. in order to block SpyGlass. I could be wrong, but I think that's how it works.
    {{ DiscussionBoard.errors[6029009].message }}
    • Profile picture of the author Steviep
      Originally Posted by dp40oz View Post

      Wait a second. Doesn't SpyGlass get its information from other search engines?
      I think you're right there. They scrape results from multiple SEs and don't have their own index (I haven't used it for over 12 months though, so it might have changed now).
      Signature

      Check out our SEO link building services at www.seospin.com [RESELLERS WELCOME!]
      Infographics, Press Releases, Web 2.0 Blog management, edu links & more...

      {{ DiscussionBoard.errors[6985908].message }}
  • Profile picture of the author nomy
    Just had an interesting conversation with a support agent at SEO Spyglass, who has told me that they will not provide the name of their bot. Below I have copied the responses that they gave to me in a short Live Chat:

    "we have our own clawler which is checking websites and collects backlinks"

    "sorry, but this info is not free fro public"

    "because our database is still in Beta and we are constantly making changes to it, so, do not provide full info about it yet"

    "sorry, but I can not hep you with this. This is our current policy."

    As you can see, some lame excuses!

    Is there any other way we can find out?
    {{ DiscussionBoard.errors[6029045].message }}
  • Profile picture of the author junne
    Banned
    [DELETED]
    {{ DiscussionBoard.errors[6029675].message }}
    • Profile picture of the author adam westrop
      Originally Posted by junne View Post

      Do it in the reverse manner: allow only the crawlers of some popular search engines, like
      googlebot, spurl, msnbot... and other popular search engines that bring traffic to your website.

      This may block the others and allow only a selective few.
      So we can set it to only allow googlebot, spurl, msnbot etc....

      Does googlebot not potentially have multiple bots? Anyone know how to do this in WordPress... I ain't really bothered with the tiny metacrawler search engines, just the big 'uns, and I imagine all of their bot names will be freely available?
      {{ DiscussionBoard.errors[6029701].message }}
      • Profile picture of the author Mike Anthony
        Originally Posted by adam westrop View Post

        Does googlebot not potentially have multiple bots? Anyone know how to do this in WordPress...
        Robots.txt is a site file, not a WordPress feature. I suppose there might be some add-on that might facilitate it though.

        As for SEO SpyGlass not wanting to share their bot info - it could be because it's in beta (or because they are using a third-party crawler they don't want people to know they are using), but it will be found out; it will just take some time, effort and comparison. If you have links showing in SpyGlass, you can look in your web traffic reporting packages and see what bots have visited.
        {{ DiscussionBoard.errors[6029812].message }}
      • Profile picture of the author mark@1to101
        Originally Posted by adam westrop View Post

        So we can set it to only allow googlebot, spurl, msnbot etc....

        Does googlebot not potentially have multiple bots? Anyone know how to do this in WordPress... I ain't really bothered with the tiny metacrawler search engines, just the big 'uns, and I imagine all of their bot names will be freely available?
        To set up your robots.txt to only allow the major search engine bots, I think you would need to do the following...

        User-agent: googlebot
        Disallow:

        User-agent: msnbot
        Disallow:

        User-agent: Slurp
        Disallow:

        User-agent: *
        Disallow: /

        ...the danger with this approach, though, is if they have multiple bots that do different things and that they don't make public.
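        The whitelist logic can be checked the same way with Python's built-in robotparser, before you risk locking anything out (a quick sketch, trimmed to two allowed bots; "SomeNewBot" is just a made-up name for an unknown crawler):

```python
from urllib import robotparser

# Whitelist-style rules as sketched above (abridged)
RULES = """\
User-agent: googlebot
Disallow:

User-agent: msnbot
Disallow:

User-agent: *
Disallow: /
"""

rp = robotparser.RobotFileParser()
rp.modified()  # mark the rules as loaded so can_fetch() will evaluate them
rp.parse(RULES.splitlines())

print(rp.can_fetch("Googlebot", "/"))   # True - whitelisted
print(rp.can_fetch("SomeNewBot", "/"))  # False - caught by the * rule
```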
        {{ DiscussionBoard.errors[6030403].message }}
  • Profile picture of the author Terry Kyle
    Spyglass currently pulls data from Alexa, Blekko, Dogpile (metasearch tool would have to be blocked at their search partner level), Google, Google Blog Search, Exalead, Icerocket Blog Search + their own crawling.
    {{ DiscussionBoard.errors[7147571].message }}
    • Profile picture of the author angelx
      Any update on this? I've tried the method above with the robots.txt file, but I have noticed that my backlinks are still published on these sites anyway.

      Would really love to find a solution that works if anyone knows one.
      {{ DiscussionBoard.errors[7148930].message }}
  • Profile picture of the author gotlinks
    My question is, why block the backlink checkers?

    The engines will still find them, so what does it matter if the backlink checkers find them? They aren't reporting to Google.
    Signature
    Learn the secrets to growing your Youtube Channel!

    - Feel free to private message me about anything. I love to help people and you are definitely no exception!
    {{ DiscussionBoard.errors[7149066].message }}
    • Profile picture of the author MikeFriedman
      Originally Posted by gotlinks View Post

      My question is, why block the backlink checkers?

      The engines will still find them, so what does it matter if the backlink checkers find them? They aren't reporting to Google.
      They are reporting to your competitors, who can report to Google.
      {{ DiscussionBoard.errors[7149079].message }}
      • Profile picture of the author gotlinks
        Originally Posted by MikeFriedman View Post

        They are reporting to your competitors who can report to Google.
        Yeah. But as long as you have good quality links, does it matter?
        Signature
        Learn the secrets to growing your Youtube Channel!

        - Feel free to private message me about anything. I love to help people and you are definitely no exception!
        {{ DiscussionBoard.errors[7149095].message }}
        • Profile picture of the author MikeFriedman
          Originally Posted by gotlinks View Post

          Yeah. But as long as you have good quality links does it matter?
          If you are setting up a link network, you probably don't want to take any chances.
          {{ DiscussionBoard.errors[7149186].message }}
        • Profile picture of the author Mike Anthony
          Originally Posted by gotlinks View Post

          Yeah. But as long as you have good quality links does it matter?
          You can have good quality links that your competitors can get too if they know where you are getting yours from. If, for example, you only have links from being featured in an article, where it's unlikely they would give links to your competitor, then no, you don't need to block.
          Signature

          {{ DiscussionBoard.errors[7149366].message }}
          • Profile picture of the author gotlinks
            Originally Posted by Mike Anthony View Post

            You can have good quality links that your competitors can get too if they know where you are getting yours from. If, for example, you only have links from being featured in an article, where it's unlikely they would give links to your competitor, then no, you don't need to block.
            I guess that makes sense.

            Thanks for clearing that up bud!
            Signature
            Learn the secrets to growing your Youtube Channel!

            - Feel free to private message me about anything. I love to help people and you are definitely no exception!
            {{ DiscussionBoard.errors[7149382].message }}
          • Profile picture of the author nik0
            Banned
            Originally Posted by Mike Anthony View Post

            You can have good quality links that your competitors can get too if they know where you are getting yours from.
            You don't block the crawlers on your site; you block them on the site linking to you.
            {{ DiscussionBoard.errors[9389837].message }}
    • Profile picture of the author cuti04061990
      Originally Posted by gotlinks View Post

      My question is, why block the backlink checkers?

      The engines will still find them, so what does it matter if the backlink checkers find them? They aren't reporting to Google.
      Because I don't want anyone to see my backlinks in Ahrefs.
      {{ DiscussionBoard.errors[8662939].message }}
    • Profile picture of the author Ryan3
      Originally Posted by gotlinks View Post

      My question is, why block the backlink checkers?

      The engines will still find them, so what does it matter if the backlink checkers find them? They aren't reporting to Google.
      Because you don't want your competitors knowing how you're ranking.

      It also helps to redirect all those bots to your competitors
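For reference, the "redirect the bots" idea mentioned above can be sketched with a couple of mod_rewrite lines. This is only an illustration: the bot names are assumed from each service's published crawler name, and the destination URL is a placeholder, not a recommendation:

```apache
RewriteEngine On
# Match the major backlink-checker user agents (names assumed;
# verify against each service's own documentation)
RewriteCond %{HTTP_USER_AGENT} (AhrefsBot|MJ12bot|rogerbot|BLEXBot) [NC]
# Send matching bots elsewhere with a temporary redirect
RewriteRule ^ http://example.com/ [R=302,L]
```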
      Signature
      "Start"
      {{ DiscussionBoard.errors[9383956].message }}
      • Profile picture of the author CTRMAX
        Originally Posted by Ryan3 View Post

        Because you don't want your competitors knowing how you're ranking.

        It also helps to redirect all those bots to your competitors
        Also, the link:domain.com operator doesn't do much anymore. Google stopped showing us that data many years ago.
        {{ DiscussionBoard.errors[9383987].message }}
  • Profile picture of the author billionareHuman
    I implemented the .htaccess block across all my sites, but I can see Majestic has picked up all of them.

    Has this happened to anyone else? I'm wondering how the bot can get around this. I can see that the date Majestic says it first saw and indexed the backlink is after the date I implemented the .htaccess blocks.

    I did a little PHP test to fake the Majestic bot, and my site indeed blocks it:

    $ch1 = curl_init();
    curl_setopt($ch1, CURLOPT_URL, "http://mysite.com/");
    curl_setopt($ch1, CURLOPT_HEADER, 0);
    // Return the response as a string so print_r() can display it
    curl_setopt($ch1, CURLOPT_RETURNTRANSFER, 1);
    // The ^ anchor belongs in the .htaccess pattern, not in the UA string
    curl_setopt($ch1, CURLOPT_USERAGENT, 'MJ12bot');
    $c = curl_exec($ch1);
    print_r($c);


    returns the html

    <!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
    <html><head>
    <title>403 Forbidden</title>
    </head><body>
    <h1>Forbidden</h1>
    <p>You don't have permission to access /
    on this server.</p>
    <p>Additionally, a 500 Internal Server Error
    error was encountered while trying to use an ErrorDocument to handle the request.</p>
    </body></html>


    So either Majestic is not identifying itself all the time, or they are getting around it somehow?

    Guess I will have to do the robots.txt too, but that is public then. I noted that someone else said their robots.txt doesn't block it, and another person said it takes a few months to go into effect.

    I don't see why it would take a few months; Majestic indicates they found the link after I implemented the .htaccess block. Unless they are not showing the real date they found the backlink, and actually found it earlier but didn't publish it on their site until later.
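For anyone trying the robots.txt route discussed in this thread, a minimal file would look something like this (the user-agent tokens are the crawlers' commonly published names, and compliance is entirely voluntary on the bot's part):

```
User-agent: MJ12bot
Disallow: /

User-agent: AhrefsBot
Disallow: /

User-agent: rogerbot
Disallow: /
```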
    {{ DiscussionBoard.errors[7796183].message }}
  • Profile picture of the author billionareHuman
    OK, I think I found the answer in another forum: the guy said .htaccess blocks all the bots except Majestic, so he used a robots.txt and that blocked Majestic.

    So Majestic gets around the .htaccess somehow. I might just put Majestic in robots.txt, or it might be safer to put them all in again. I don't like how it's public, though.
    {{ DiscussionBoard.errors[7796246].message }}
    • Profile picture of the author networm
      Originally Posted by billionareHuman View Post

      OK, I think I found the answer in another forum: the guy said .htaccess blocks all the bots except Majestic, so he used a robots.txt and that blocked Majestic.

      So Majestic gets around the .htaccess somehow. I might just put Majestic in robots.txt, or it might be safer to put them all in again. I don't like how it's public, though.
      Where did you find it and how to implement it?

      Can you share the link, please?
      {{ DiscussionBoard.errors[7967951].message }}
  • Profile picture of the author standfast999
    If you're on WordPress, use StatPress on a test site and monitor the bots in the plugin.

    Great post
    {{ DiscussionBoard.errors[7973288].message }}
    • Profile picture of the author CTRMAX
      I built a WordPress plugin called Spyder Spanker to do this for WordPress users who are non-techie.

      Many of my SEO customers use it to do this and other things.

      It's a WSO if you want to read about it. I am not trying to be spammy or pushy, just thought I would mention it.

      Thanks,
      Todd
      {{ DiscussionBoard.errors[8311044].message }}
      • Profile picture of the author KyGunator
        Originally Posted by CTRMAX View Post

        I built a WordPress plugin called Spyder Spanker to do this for WordPress users who are non-techie.

        Many of my SEO customers use it to do this and other things.

        It's a WSO if you want to read about it. I am not trying to be spammy or pushy, just thought I would mention it.

        Thanks,
        Todd
        I use it.

        Saves a ton of time, and YES it is essential. On some of my sites it gives a fatal error on installation, but 95% of the time it works great. Thanks for the great product my friend.
        {{ DiscussionBoard.errors[8684700].message }}
  • Profile picture of the author Jesus M
    Hello there, You guys might find this interesting:

    hxxp://spyderspanker.com (not affiliate link)
    {{ DiscussionBoard.errors[8683626].message }}
  • Profile picture of the author V12
    Spider Spanker looks interesting.
    {{ DiscussionBoard.errors[8684037].message }}
  • Profile picture of the author asokr
    Use .htaccess to block the bots. Note that a literal space inside a user-agent pattern must be escaped with a backslash; otherwise Apache treats the text after the space as flags and the directive breaks:
    # Begin HackRepair.com Blacklist
    RewriteEngine on
    # Abuse Agent Blocking
    RewriteCond %{HTTP_USER_AGENT} ^BlackWidow [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Bolt\ 0 [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Bot\ mailto:craftbot@yahoo.com [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} CazoodleBot [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^ChinaClaw [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Custo [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Default\ Browser\ 0 [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^DIIbot [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^DISCo [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} discobot [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Download\ Demon [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^eCatch [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ecxi [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^EirGrabber [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^EmailCollector [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^EmailSiphon [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^EmailWolf [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Express\ WebPictures [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^ExtractorPro [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^EyeNetIE [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^FlashGet [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^GetRight [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^GetWeb! [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Go!Zilla [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Go-Ahead-Got-It [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^GrabNet [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Grafula [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} GT::WWW [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} heritrix [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^HMView [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} HTTP::Lite [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} HTTrack [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ia_archiver [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} IDBot [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} id-search [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} id-search.org [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Image\ Stripper [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Image\ Sucker [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} Indy\ Library [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^InterGET [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Internet\ Ninja [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^InternetSeer.com [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} IRLbot [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ISC\ Systems\ iRc\ Search\ 2.1 [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Java [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^JetCar [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^JOC\ Web\ Spider [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^larbin [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^LeechFTP [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} libwww [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} libwww-perl [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Link [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} LinksManager.com_bot [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} linkwalker [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} lwp-trivial [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Mass\ Downloader [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Maxthon$ [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} MFC_Tear_Sample [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^microsoft.url [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} Microsoft\ URL\ Control [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^MIDown\ tool [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Mister\ PiX [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} Missigua\ Locator [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Mozilla.*Indy [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Mozilla.*NEWT [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^MSFrontPage [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Navroad [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^NearSite [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^NetAnts [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^NetSpider [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Net\ Vampire [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^NetZIP [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Nutch [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Octopus [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Offline\ Explorer [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Offline\ Navigator [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^PageGrabber [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} panscient.com [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Papa\ Foto [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^pavuk [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} PECL::HTTP [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^PeoplePal [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^pcBrowser [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} PHPCrawl [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} PleaseCrawl [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^psbot [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^RealDownload [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^ReGet [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Rippers\ 0 [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} SBIder [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^SeaMonkey$ [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^sitecheck.internetseer.com [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^SiteSnagger [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^SmartDownload [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} Snoopy [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} Steeler [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^SuperBot [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^SuperHTTP [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Surfbot [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^tAkeOut [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Teleport\ Pro [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Toata\ dragostea\ mea\ pentru\ diavola [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} URI::Fetch [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} urllib [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} User-Agent [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^VoidEYE [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Web\ Image\ Collector [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Web\ Sucker [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} Web\ Sucker [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} webalta [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^WebAuto [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^[Ww]eb[Bb]andit [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} WebCollage [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^WebCopier [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^WebFetch [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^WebGo\ IS [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^WebLeacher [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^WebReaper [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^WebSauger [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Website\ eXtractor [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Website\ Quester [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^WebStripper [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^WebWhacker [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^WebZIP [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} Wells\ Search\ II [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} WEP\ Search [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Wget [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Widow [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^WWW-Mechanize [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^WWWOFFLE [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Xaldon\ WebSpider [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} zermelo [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Zeus [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^(.*)Zeus.*Webster [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ZyBorg [NC]
    RewriteRule ^.* - [F,L]
    # Abuse bot blocking rule end
    # End HackRepair.com Blacklist
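The matching behavior of a blacklist like this can be sanity-checked offline. Here is a small Python sketch (an illustration, not part of the .htaccess itself) that mirrors the RewriteCond semantics for a handful of the patterns above, treating the [NC] flag as case-insensitive matching:

```python
import re

# A few patterns from the blacklist above. Apache's [NC] flag maps to
# re.IGNORECASE; patterns are unanchored unless they begin with ^,
# matching how RewriteCond applies them to %{HTTP_USER_AGENT}.
BLOCKED_PATTERNS = [r"^Wget", r"HTTrack", r"libwww", r"^WebZIP", r"^Java"]

def is_blocked(user_agent: str) -> bool:
    """Return True if any blacklist pattern matches the user agent."""
    return any(re.search(p, user_agent, re.IGNORECASE)
               for p in BLOCKED_PATTERNS)

print(is_blocked("Wget/1.21.2"))              # True: anchored ^Wget matches
print(is_blocked("Mozilla/5.0 (Windows NT)")) # False: no pattern matches
```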
    {{ DiscussionBoard.errors[8687861].message }}
  • Profile picture of the author networm
    @asokr

    That can also be done automatically for those who are using the iThemes Security WP plugin. Then, the additional bots to be blocked can simply be added.

    I don't know if this works the same with the Spyder Spanker plugin the author posted; maybe it does it through PHP... I'm not sure.

    But one thing I noticed lately: no matter how you block Majestic, whether through .htaccess or Spyder Spanker, it still manages to fetch backlinks. Doing a bit of searching, I found an article posted by Majestic SEO on their blog saying that blocking their crawler won't actually or totally prevent them from crawling and detecting the backlinks. You can read more of this here;
    Code:
    http://blog.majesticseo.com/general/verifying-site-majesticseo-every-seo-audit/
    Refer to this section;
    Why should I let Majestic’s bot spider my site?

      It’s a fair question. Bear in mind that you are really not “hiding” your website much by blocking our bot, as we show you links INTO a website. We do not have to actually crawl your site to know about it. If you look at a road map, it says nothing about the size of a town, but you can pretty much deduce this just by looking at the size and number of roads into town. It’s the same with the link graph. So blocking our bot only stops us looking at the links OUT of your website to other sites.
    So, can anyone who has good knowledge of this explain it?

    Perhaps the Spyder Spanker author can explain how to block it, because I'm using Spyder Spanker Pro on several sites and they all share the same results in Majestic.

    Ahrefs, on the other hand, tends to obey .htaccess and robots.txt.
    {{ DiscussionBoard.errors[9381314].message }}
  • Profile picture of the author watkip
    I'm guessing Google might have an algorithm that starts looking for PBN footprints when their spiders notice all these blocks in .htaccess.
    {{ DiscussionBoard.errors[9381488].message }}
    • Profile picture of the author CTRMAX
      Hi,

      Let me try to clear up some questions. I started the SEO hosting craze in 2004 and have been blocking bots ever since.

      1) Know that anyone can read your .htaccess file from the web.

      2) Spyder Spanker PRO (for SEOs) is designed to stop the backlink checkers from crawling your private blog network and seeing where you are linking to. PBNs are not cheap, and if your competition knows what they all are, then they can try to take them down with DDoS or other evil ways. Blocking them on your money site is not enough, and Spyder Spanker only stops bots from crawling sites that you put it on.

      3) SS PRO does not use .htaccess; when using .htaccess to block spiders, for some reason it doesn't always work, and most people will fry their site trying to edit .htaccess. We do not touch .htaccess for this reason.

      4) Any service (like Moz) can spoof their user agent, and that is why with SS PRO we can block entire IP ranges.

      5) If you get a Fatal Error, you need to ask your host to install ionCube Loaders; it's free, it's easy, and requires no maintenance. If a tech says it's working and you still get the Fatal Error, ask for a more experienced Linux administrator to help you.


      -Cheers

      Todd S.
      {{ DiscussionBoard.errors[9383732].message }}
      • Profile picture of the author MikeFriedman
        Originally Posted by CTRMAX View Post

        3) SS PRO does not use .htaccess; when using .htaccess to block spiders, for some reason it doesn't always work, and most people will fry their site trying to edit .htaccess. We do not touch .htaccess for this reason.
        I missed this part before. If it is not writing to .htaccess file or robots.txt, it is a much more difficult footprint to track down, and would be much more resource intensive for search engines to try to do so.

        Anyone blocking with robots.txt or .htaccess is easy to spot. Google already has that data indexed.

        As long as it is not resource intensive (because Wordpress is already a slow enough POS as it is), you should be okay using it.
        {{ DiscussionBoard.errors[9391773].message }}
        • Profile picture of the author chris_87
          Originally Posted by MikeFriedman View Post

          Anyone blocking with robots.txt or .htaccess is easy to spot. Google already has that data indexed.
          Confused on this point. The .htaccess file is only viewable from the root directory of your web server; visitors are not able to read it.
          {{ DiscussionBoard.errors[9391802].message }}
          • Profile picture of the author CTRMAX
            Normally .htaccess is considered a hidden file and will show as forbidden. I don't touch .htaccess because I see so many users break their sites when messing with it, and then they blame the host (me), which isn't fair.

            robots.txt, on the other hand, is read by all and really should not even be relied on. No bot *must* abide by robots.txt rules, and most bots just ignore it anyway. Never trust robots.txt; it's worthless.


            Originally Posted by chris_87 View Post

            Confused on this point. The .htaccess file is only viewable from the root directory of your web server; visitors are not able to read it.
            {{ DiscussionBoard.errors[9391814].message }}
          • Profile picture of the author MikeFriedman
            Originally Posted by chris_87 View Post

            Confused on this point. The .htaccess file is only viewable from the root directory of your web server; visitors are not able to read it.
            Sorry. Most of the time nobody can see the .htaccess file. However, there are some hosts that do not have the server config file set correctly. They are few and far between, but considering most people with a private network are looking for dirt-cheap hosts, you may encounter them a little more often.

            I've found a few in the past.
            {{ DiscussionBoard.errors[9391835].message }}
  • Profile picture of the author danparks
    Spyder Spanker looks like an interesting product, but I have one concern about it. Perhaps CTRMAX can address this. If a person installs it on a large number of PBN sites, wouldn't that leave a big footprint? About the only people interested in hiding backlink sources are people running PBNs, so wouldn't it raise a red flag if Google sees this plug-in on many sites delivering backlinks to one money site?
    {{ DiscussionBoard.errors[9389181].message }}
    • Profile picture of the author MikeFriedman
      Originally Posted by danparks View Post

      Spyder Spanker looks like an interesting product, but I have one concern about it. Perhaps CTRMAX can address this. If a person installs it on a large number of PBN sites, wouldn't that leave a big footprint? About the only people interested in hiding backlink sources are people running PBNs, so wouldn't it raise a red flag if Google sees this plug-in on many sites delivering backlinks to one money site?
      If I was trying to track down private networks, it is certainly one way I would target them.
      {{ DiscussionBoard.errors[9389210].message }}
  • Profile picture of the author tyronne78
    Spyder Spanker does the job quite well, plus you can install it on an unlimited number of sites.

    Spyder Spanker Bot Blocker | Stop Rogue Spyders and Bots!
    {{ DiscussionBoard.errors[9390500].message }}
  • Profile picture of the author danparks
    Originally Posted by Mike Anthony View Post

    You can have good quality links that your competitors can get too if they know where you are getting yours from.
    Originally Posted by nik0 View Post

    You don't block the crawlers on your site, you block them on the site linking to you.
    Unless I'm terribly mistaken about Spyder Spanker (which I admit is a possibility), then I believe Mike is correct here. You use Spyder Spanker to block crawlers from seeing your PBN. Then, when someone looks at backlinks for your money site, your PBN backlinks don't show up. Thus preventing competitors from seeing where your money site is getting its backlinks from.

    I'm still hoping someone (like CTRMAX) will address my previously mentioned concern:

    Originally Posted by danparks View Post

    Spyder Spanker looks like an interesting product, but I have one concern about it. Perhaps CTRMAX can address this. If a person installs it on a large number of PBN sites, wouldn't that leave a big footprint? About the only people interested in hiding backlink sources are people running PBNs, so wouldn't it raise a red flag if Google sees this plug-in on many sites delivering backlinks to one money site?
    {{ DiscussionBoard.errors[9391103].message }}
    • Profile picture of the author MikeFriedman
      Originally Posted by danparks View Post

      I'm still hoping someone (like CTRMAX) will address my previously mentioned concern:

      Truthfully, he's not going to be able to answer that for you. How would he have any clue how Google may or may not be using that information now, or what they might do with it in the future?

      In my opinion, it can create a pretty obvious footprint. They could easily track down sites blocking those spiders and then manually review them. It would be an extremely easy way to identify quite a few private network sites.
      {{ DiscussionBoard.errors[9391118].message }}
      • Profile picture of the author danparks
        Originally Posted by MikeFriedman View Post

        Truthfully, he's not going to be able to answer that for you.

        In my opinion, it can create a pretty obvious footprint.
        That is my thought as well. But hey, he's the creator of the product, so the footprint issue must have occurred to him, and I assume he's been asked about it before, so I thought I'd give him the opportunity to address it here. I'd love to hear his reply.
        {{ DiscussionBoard.errors[9391145].message }}
        • Profile picture of the author MikeFriedman
          Originally Posted by danparks View Post

          That is my thought as well. But hey, he's the creator of the product, so the footprint issue must have occurred to him, and I assume he's been asked about it before, so I thought I'd give him the opportunity to address it here. I'd love to hear his reply.
          Lol...

          Well, what do you really think he is going to say?

          There are 2 reasons people would use this tool.

          1) To block spiders to save bandwidth and conserve resources on their hosting account.

          2) To block spiders from common backlink services to try to hide their private networks, which is I bet 99% of its user base.

          To say anything other than, "No. There's no footprint there. Nothing to worry about," could drastically hurt sales.

          Just something to think about.
          {{ DiscussionBoard.errors[9391157].message }}
    • Profile picture of the author Mike Anthony
      Originally Posted by danparks View Post

      Unless I'm terribly mistaken about Spyder Spanker (which I admit is a possibility), then I believe Mike is correct here. You use Spyder Spanker to block crawlers from seeing your PBN.
      I can only think nik0 was referring to money sites, but since you own the PBNs, it's pretty much what I said. There's no way you can block crawlers on other sites that link to you unless you are hacking into them.
      Signature

      {{ DiscussionBoard.errors[9399695].message }}
  • Profile picture of the author gas
    What if you used one of those plugins that can hide which theme and plugins you are using? Would that help cover the footprint left by a plugin like Spyder Spanker?
    {{ DiscussionBoard.errors[9391219].message }}
    • Profile picture of the author MikeFriedman
      Originally Posted by gas View Post

      What if you used one of those plugins that can hide which theme and plugins you are using? Would that help cover the footprint left by a plugin like Spyder Spanker?
      Wordpress has nothing to do with it.
      {{ DiscussionBoard.errors[9391240].message }}
      • Profile picture of the author gas
        Originally Posted by MikeFriedman View Post

        Wordpress has nothing to do with it.
        Oh, I thought Spyder Spanker was a WordPress plugin that people use to hide their WordPress PBN sites from common backlink services like Open Site Explorer, Majestic, Ahrefs & Spyglass, but some are worried about a footprint issue if they use it. I must have been reading a different thread.
        {{ DiscussionBoard.errors[9391334].message }}
        • Profile picture of the author MikeFriedman
          Originally Posted by gas View Post

          Oh, I thought Spyder Spanker was a WordPress plugin that people use to hide their WordPress PBN sites from common backlink services like Open Site Explorer, Majestic, Ahrefs & Spyglass, but some are worried about a footprint issue if they use it. I must have been reading a different thread.
          The footprint we are talking about has NOTHING to do with Wordpress. You could create the exact same footprint on any site. Hiding that the site is a Wordpress site does nothing to hide this footprint.
          {{ DiscussionBoard.errors[9391534].message }}
          • Profile picture of the author CTRMAX
            Hey everyone (again),

            Let me start with a little background, so maybe the "pros" in this thread will give me a shred of credibility. I started in 1996. It's probably safe to say that I have been doing SEO online longer than most of you have been out of high school.

            First, about my SEO skills: I am the one who invented SEO hosting in 2003, 11 years ago. Yes, I am the one that started it all, including private blog networking; PBNs weren't even the rage until I made the first course on it 3 years ago. Anyway, whatever, it doesn't really matter.

            Right now I still own the original SEO hosting company that started it all, and I've been blocking spiders of all kinds for about 8 years now. So I know a little bit about them.

            Currently, I see Google crawling my network of servers with 10,000 threads every second of every day. If some footprint were a problem, then I would be out of business. All my customers use SS.

            I have never ever had a network pegged and put down due to my plugins or my hosting. Ever.

            Also, I am not here to convince anyone to buy it. I couldn't care less. It's not my main business, and I am not some "guru" product launcher that needs a few plugin sales to pay the internet bill. I've been MMO since 1996, and I was a lurker/member of WF in the late '90s when it was just a bunch of spammers *cough* I mean opt-out email marketers.

            Sidenote: why do you think it's called Warriors?

            Anyway, SS is just a helpful plugin I started developing years ago, and a few years ago I decided to make it public. But again, you either got it and love it or you don't; either way, it doesn't matter to me. (I am going to take it off here pretty soon anyway.)

            The bottom line is this: Google is not checking your site to see who you're blocking. Do you really think they care if you're blocking scrapers, hackers, and spammers?

            They have a lot of resources, yes, but do you really think they would rescan the web looking for that? They are not doing that today, and to think that they would go ahead and spend a billion doing it is a bit paranoid. That is my opinion. Is it possible? Yes. Is traveling 200 MPH for 500 miles on a motorcycle possible? Yes. Practical? No.

            I think there are better conspiracy theories to chase than that one, and they would be more fun too. Anyway, have a great day. I'm probably done with this thread.

            Thanks to all the cool peeps out there that do use SS, and thanks for the great feedback too. I may not say it often, but it makes me proud to see it!

            -Peace.
            {{ DiscussionBoard.errors[9391650].message }}
            • Profile picture of the author Mike Anthony
              Originally Posted by CTRMAX View Post

              They have a lot of resources, yes, but do you really think they would rescan the web looking for that? They are not doing that today, and to think that they would go ahead and spend a billion doing it is a bit paranoid. That is my opinion. Is it possible? Yes. Is traveling 200 MPH for 500 miles on a motorcycle possible? Yes. Practical? No.
              This is quite the oversell. I have no problem with your software or plugin (used in moderation). I suspect a lot of people who wouldn't otherwise block do so because there's a plugin for it. However, to try and sell that you know Google does not record blocked sites (after all, it's a programmatic event that can be recorded in a database) is a bit much, and it certainly would NOT cost a billion dollars to do.

              Shucks, I could actually see Moz or Majestic sharing that information with Google under the right beneficial circumstances. It's not personal data, and Moz is certainly no great proponent (at least publicly) of SEO networks.

              My concern for blocking and tools that allow you to do it is that people use that as an excuse to make all their PBN sites crappy. the only absolute defense is to make your sites natural looking which I admit is not possible for everyone but you should at least have some that could pass a manual review.

              Incidentally I have never heard of you. Over Three years ago the course that got people going on PBNs was Terry Kyle's. You associated with him?
              • Profile picture of the author CTRMAX
                Umm, ok, I'll indulge you, Mike Anthony. I could use a little post-count inflation, I guess.

                What is oversold again? My opinion of Google and their expense sheet? Does it really matter?

                So tell me exactly how Google would know everyone I am blocking. After all, you said it's "a programmatic event that can be recorded in a database."

                Seriously, no hypothetical BS: just tell me how Google does that for every site in the world and records it. And then tell me what that would cost in computational value. If it's not a billion, then what is it? Obviously I'm wrong.

                So, you would use my plugin in moderation? Why? If I want to block scrapers and spammers from my sites, why would I do that on only a few of them and not all?

                You think most people block IPs and robots just because it's a plugin and they have nothing better to do with their time and money? They do it because I have brought what is going on under their noses to their attention and provided an easy way to act on it, instead of mucking up their sites with .htaccess edits. They do it because it's their right to privacy; they do it because they do not want others to steal their data and then sell it for profit. There are many reasons, Mike.

                I think the oversell is your "shucks" comment. Google hates Moz and all the SEO tools out there. They are actually cutting SEO tools' API access.

                Why would you even think SS is primarily used to hide a crappy PBN? Hide it from whom? Who is the only one in the world that cares if a site is crappy? Right -- Google -- and hiding a site from Google defeats the whole purpose. So let me know what you are talking about there. Overall, this is totally missing the point of this thread and has nothing to do with it at all. Besides, making crappy PBNs has been over for years.

                Incidentally, I've never heard of you either. I started SEO hosting and blog networking back in 2003-04. I revealed this at the Traffic Domination Workshop Mastermind with Ryan Deiss and Keith Baxter (I don't think you were there, or in any of my circles for that matter); you can ask them for my credentials if you like. I'll never have a big post count or a WF badge, if that's your measuring stick.

                And finally, after some prodding, I made a PBN course a few years ago. Gave it away to most and sold it to a few. It's not rocket science, that's for sure.

                And I did not know Terry until he approached me about SS a year ago.

                So are you associated with him? Or anyone else I should know about? And why does it matter who I am associated with? Seriously though, I am kidding; I don't care who you are associated with.

                I am not really sure why you feel it's necessary to come over the top, antagonize me, and try to call me out like I don't know what I am talking about.

                Personally, I think this thread should be stopped and deleted if possible. It was a simple question that's been beaten to death and dragged through the mud now.

                This is another example of why I don't post here: so much disrespect when big post count comes up against little post count with a big idea, lol!

                -Ts


          • Profile picture of the author gas
            Originally Posted by MikeFriedman View Post

            The footprint we are talking about has NOTHING to do with Wordpress. You could create the exact same footprint on any site. Hiding that the site is a Wordpress site does nothing to hide this footprint.

            Originally Posted by danparks View Post

            Spyder Spanker looks like an interesting product, but I have one concern about it. Perhaps CTRMAX can address this. If a person installs it on a large number of PBN sites, wouldn't that leave a big footprint? About the only people interested in hiding backlink sources are people running PBNs, so wouldn't it raise a red flag if Google sees this plug-in on many sites delivering backlinks to one money site?
            Not talking about a Wordpress footprint here then?
            • Profile picture of the author MikeFriedman
              Originally Posted by gas View Post

              Not talking about a Wordpress footprint here then?
              No. We were talking about blocking bots causing a footprint. That footprint is independent of WordPress.
  • Profile picture of the author SEOWizard417
    Spyder Spanker works pretty well for this purpose. I see a lot of people are worried about it leaving a footprint, but that's better than the alternative of your competitors being able to find your network and report it to Google.

    PBNs can be pricey to set up and maintain, so you want to protect that investment as best you can.
  • Profile picture of the author Profit-smart
    Blocking purely by .htaccess/bot names is only half the battle. You also need to maintain a list of the IPs that crawlers use. Ahrefs, Majestic, and the rest are generally what are known as "bad bots," meaning they don't always identify themselves truthfully.
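
    To illustrate both halves, here is a rough .htaccess sketch (Apache 2.2-style directives, as found on typical cPanel hosts). The CIDR ranges are RFC 5737 documentation placeholders, not the tools' real addresses; substitute the ranges each service actually publishes.

    ```apache
    # Half one: tag known backlink-tool user-agents and deny them.
    SetEnvIfNoCase User-Agent (AhrefsBot|MJ12bot|rogerbot) bad_bot
    Order Allow,Deny
    Allow from all
    Deny from env=bad_bot
    # Half two: deny the crawlers' netblocks, which also catches requests
    # that spoof a browser user-agent. Placeholder ranges shown below.
    Deny from 192.0.2.0/24
    Deny from 198.51.100.0/24
    ```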
    • Profile picture of the author CTRMAX
      Yes, some spoof and some don't (Moz does, although they claim otherwise). It's not that hard to figure out, nor is it that hard to find their netblocks. Because these services spoof, the latest version has a "nuclear option" which just stops them all; no need to figure out anything.
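
      A "block everything not whitelisted" approach can be sketched in .htaccess along these lines (a blunt illustration only, not the plugin's actual code; note that most browsers, and many spoofing bots, send "Mozilla" in their user-agent string, so on its own this is not a reliable filter):

      ```apache
      RewriteEngine On
      # Deny by default: return 403 unless the user-agent contains a
      # whitelisted token.
      RewriteCond %{HTTP_USER_AGENT} !(Googlebot|bingbot|Mozilla) [NC]
      RewriteRule ^ - [F]
      ```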

      Originally Posted by Profit-smart View Post

      Blocking purely from .htaccess/bot names is only half the battle. You also need to maintain a list of IPs that crawlers use. Ahrefs, Majestic, and so on, are generally what are known as "Bad bots". Meaning they dont always identify themselves truthfully.
  • Profile picture of the author Emmaboh
    What you need is the Spyder Spanker or GhostPBN plugin. They will hide your backlink profile from those bots.
  • Profile picture of the author whitecap
    There are many plugins which can help you do that. You can also block bots using .htaccess.
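
    For example, a minimal user-agent block in .htaccess might look like this (a sketch only; the bot names come from each tool's published documentation, so verify the current strings before relying on them):

    ```apache
    # Tag requests from the major backlink-tool crawlers, then deny them.
    SetEnvIfNoCase User-Agent (AhrefsBot|MJ12bot|rogerbot|dotbot) bad_bot
    Order Allow,Deny
    Allow from all
    Deny from env=bad_bot
    ```

    Bots that ignore or spoof their user-agent will still get through, which is why IP-based blocking also comes up in this thread.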
