How to block someone's IP address?

by Ani86
29 replies
Hi,

I have been getting junk emails from automated bots on my website's form. I installed a CAPTCHA and that worked for a while, but a few days ago I started getting those annoying emails again.

I have the sender's IP address. Is there a way to block the spammer's IP address? My host is Yahoo, and Yahoo does not support .htaccess.
(I'm trying to change hosting soon.)

Is there a way to block the IP without using .htaccess?
#address #block #email #ipaddress #someones #spam
  • Profile picture of the author Nathan K
    Add this code at the top of your header file and enter the IP addresses you want to block:

    <?php
    // IP addresses to block - replace the placeholders with the spammer's address(es)
    $deny = array("000.000.000.000", "000.000.000.001");
    if (in_array($_SERVER['REMOTE_ADDR'], $deny)) {
        header("Location: http://www.google.com/");
        exit();
    }
    ?>
    • Profile picture of the author sonzoy
      If your host doesn't support .htaccess, then use Nathan K's PHP script; otherwise, changing hosts will always be an advantage.
  • Profile picture of the author Kingfish85
    Originally Posted by Ani86 View Post

    Hi,

    I have been getting junk emails from automated bots on my website's form. I installed a CAPTCHA and that worked for a while, but a few days ago I started getting those annoying emails again.

    I have the sender's IP address. Is there a way to block the spammer's IP address? My host is Yahoo, and Yahoo does not support .htaccess.
    (I'm trying to change hosting soon.)

    Is there a way to block the IP without using .htaccess?
    1. The IPs will most likely always be different.

    2. Yahoo doesn't support .htaccess... yeah, it's time to find a "real" web host.
    Signature

    |~| VeeroTech Hosting - sales @ veerotech.net
    |~| High Performance CloudLinux & LiteSpeed Powered Web Hosting
    |~| cPanel & WHM - Softaculous - Website Builder - R1Soft - SpamExperts
    |~| Visit us @veerotech Facebook - Twitter - LinkedIn

    • Profile picture of the author Ani86
      They say that they support htaccess.txt, not .htaccess.

      Do you think this will work if I use htaccess.txt?
  • Profile picture of the author chrislim2888
    Personally, I don't recommend blocking at the page level. If possible, changing hosts is an option worth considering as a long-term solution.
    • Profile picture of the author Ani86
      I called Yahoo again. They said .htaccess is the same as their htaccess.txt: they don't put the dot (.) at the beginning, but instead use a .txt extension. The tech guy said it's the same thing.

      Now how can I block the IP address in the htaccess file? Does anyone know how to write the code for that?

      I'd also like to learn more about Nathan K's code. Does anyone have a link that explains the code?

      I wrote this code, but I'm not sure if it's correct or not:

      <Limit GET>
      order allow,deny
      deny from 10.20.30
      deny from 192.168.207.154
      allow from all
      </Limit>
      • Profile picture of the author kyoo
        Originally Posted by Ani86 View Post

        I'd also like to learn more about Nathan K's code. Does anyone have a link that explains the code?
        Nathan's code will only work on a PHP page, not a static HTML page.

        It basically runs through a list of IP addresses (in the $deny variable), and if the browsing IP matches one of them, it redirects the visitor somewhere else by writing a Location header.

        Your Apache code should work, but I always place the deny lines after the allow, because I'm not sure whether it gets applied in order.

        <Limit GET>
        order allow,deny
        allow from all
        deny from 10.20.30
        deny from 192.168.207.154
        </Limit>

        • Profile picture of the author Ani86
          So if I use this code

          <?php
          // IP addresses to block - replace the placeholders with the spammer's address(es)
          $deny = array("000.000.000.000", "000.000.000.001");
          if (in_array($_SERVER['REMOTE_ADDR'], $deny)) {
              header("Location: http://www.google.com/");
              exit();
          }
          ?>

          All I have to do is put the IP addresses in place of the placeholders and put the code at the beginning of the contact.php file? Is that all?
  • Profile picture of the author kyoo
    That should do it, although I haven't tested this snippet of code, so there may be typos, etc. Nathan K probably pulled it from somewhere it was working.
  • Profile picture of the author webzie
    Blocking him from your website would be hard, because all he needs to do is use a proxy to disguise his IP. But you can block it from your hosting control panel.
    • Profile picture of the author Ani86
      Originally Posted by webzie View Post

      Blocking him from your website would be hard, because all he needs to do is use a proxy to disguise his IP. But you can block it from your hosting control panel.
      How do I block it from my hosting? My host is 1and1.
  • Profile picture of the author CrazyStyle
    Code:
    <?php
    // Replace xx.xx.xx.xxx with the spammer's IP address
    if ($_SERVER['REMOTE_ADDR'] == 'xx.xx.xx.xxx') {
        die("Go to hell.");
    }
    ?>
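    If the spam comes from a whole subnet rather than one address, a range check is a possible variant of these PHP snippets. This is only a sketch under that assumption; the ip_in_cidr() helper and the 192.0.2.0/24 example range are not from this thread.

    Code:
    <?php
    // Sketch: block an entire IPv4 range (CIDR) instead of listing single addresses.
    // 192.0.2.0/24 is a documentation example range - replace it with the spammer's range.
    function ip_in_cidr($ip, $cidr) {
        list($subnet, $bits) = explode('/', $cidr);
        $mask = -1 << (32 - (int)$bits);          // network mask as an integer
        return (ip2long($ip) & $mask) === (ip2long($subnet) & $mask);
    }

    if (ip_in_cidr($_SERVER['REMOTE_ADDR'], '192.0.2.0/24')) {
        header("Location: http://www.google.com/");
        exit();
    }
    ?>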
  • Profile picture of the author webisland
    You can write rewrite rules, or you can use PHP or any server-side script to block an IP address or a country.
  • Profile picture of the author WriteArm
    Ani86: Have you tried just using IP Deny in cPanel?
  • Profile picture of the author DAVE49
    Here you can find all information about your IP
  • Profile picture of the author project1010
    If you have programmatic control over your website, then you can block an IP address. For example, in VBScript for ASP:
    <%
    ' Replace XXX.XXX.XXX.XXX with the spammer's IP address
    If Request.ServerVariables("REMOTE_ADDR") = "XXX.XXX.XXX.XXX" Then Response.Redirect "http://www.google.com"
    %>

    where XXX.XXX.XXX.XXX is his IP address.
  • Profile picture of the author Kingfish85
    junk emails from automated bots

    This means the IP addresses CHANGE....they CHANGE = not the same.

    Adding code to the htaccess is NOT going to do anything here....do you people even read the thread before commenting with useless replies?!

    You need to use something like CloudFlare or a host that has a better firewall or WAF.
    Signature

    |~| VeeroTech Hosting - sales @ veerotech.net
    |~| High Performance CloudLinux & LiteSpeed Powered Web Hosting
    |~| cPanel & WHM - Softaculous - Website Builder - R1Soft - SpamExperts
    |~| Visit us @veerotech Facebook - Twitter - LinkedIn

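    One note if you do put CloudFlare in front of the site: PHP's REMOTE_ADDR will then show CloudFlare's proxy address, so IP checks like the snippets above need the original visitor IP. A minimal sketch, assuming CloudFlare's CF-Connecting-IP request header:

    Code:
    <?php
    // Sketch: behind CloudFlare, the visitor's real address arrives in CF-Connecting-IP.
    $visitor_ip = isset($_SERVER['HTTP_CF_CONNECTING_IP'])
        ? $_SERVER['HTTP_CF_CONNECTING_IP']
        : $_SERVER['REMOTE_ADDR'];
    ?>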
  • Profile picture of the author buysellbrowse
    Since you have a website form visible from anywhere in the world, blocking a few IPs will not help much. You need a better captcha system that hasn't been broken yet, preferably something that requires the user to understand instructions, then move an object with the mouse to complete a task. Eye-hand coordination is something a computer cannot do (yet).
    Signature
    buysell-browse.com * Free Classifieds Advertising & Promotion *
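    A simpler relative of that idea, though not the mouse-based task described above, is a question the visitor has to answer before the form is accepted. A minimal sketch, with hypothetical field and session names:

    Code:
    <?php
    // Sketch: a tiny arithmetic challenge kept in the session. Weaker than a real
    // CAPTCHA, but it stops dumb bots that replay the same POST over and over.
    session_start();

    if ($_SERVER['REQUEST_METHOD'] === 'POST') {
        $expected = isset($_SESSION['challenge_answer']) ? (string) $_SESSION['challenge_answer'] : null;
        $given    = isset($_POST['challenge']) ? trim($_POST['challenge']) : '';
        if ($expected === null || $expected !== $given) {
            die('Wrong answer to the anti-spam question.');
        }
        // ...process the contact form here...
    }

    // When rendering the form, generate a fresh question:
    $a = rand(1, 9);
    $b = rand(1, 9);
    $_SESSION['challenge_answer'] = $a + $b;
    echo 'What is ' . $a . ' + ' . $b . '? <input type="text" name="challenge">';
    ?>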
  • Profile picture of the author Roy Jones
    You can block an IP address using IPsec:

    STEP 1:
    Click on the Start menu, then click on Run.

    STEP 2:
    Type "secpol.msc" and click OK.

    STEP 3:
    When the Local Security Settings console opens, click on "IP Security Policies on Local Computer".
    • Profile picture of the author shahriyar
      I receive a lot of spam every day on my sites. Blocking IPs is a solution, but not a good one, because the spam will come from all kinds of different IPs; they never stay the same.

      You should focus on improving security on your forms. Have you tried using reCAPTCHA? http://www.google.com/recaptcha
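      For anyone wiring that up, a rough sketch of the server-side check for the newer reCAPTCHA widget. The keys are placeholders; the siteverify endpoint and the g-recaptcha-response field come from Google's current reCAPTCHA API, not from this thread.

      Code:
      <?php
      // Sketch: verify the reCAPTCHA answer before processing the form.
      $secret   = 'YOUR_SECRET_KEY';   // placeholder - use your own secret key
      $response = isset($_POST['g-recaptcha-response']) ? $_POST['g-recaptcha-response'] : '';

      $context = stream_context_create(array('http' => array(
          'method'  => 'POST',
          'header'  => "Content-Type: application/x-www-form-urlencoded\r\n",
          'content' => http_build_query(array(
              'secret'   => $secret,
              'response' => $response,
              'remoteip' => $_SERVER['REMOTE_ADDR'],
          )),
      )));
      $result = json_decode(
          file_get_contents('https://www.google.com/recaptcha/api/siteverify', false, $context),
          true
      );

      if (empty($result['success'])) {
          die('CAPTCHA verification failed.');
      }
      // ...continue processing the form...
      ?>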
      • Profile picture of the author Karen Blundell
        Rather than blocking IP addresses for bots, it is far more efficient to block the user agent or referer in your .htaccess file - or, in the OP's case, htaccess.txt.
        Here is a pretty good one:

        Code:
        RewriteEngine On
        
        RewriteCond %{HTTP_USER_AGENT} ^BlackWidow [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Bot\ mailto:craftbot@yahoo.com [OR]
        RewriteCond %{HTTP_USER_AGENT} ^ChinaClaw [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Custo [OR]
        RewriteCond %{HTTP_USER_AGENT} ^DISCo [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Download\ Demon [OR]
        RewriteCond %{HTTP_USER_AGENT} ^eCatch [OR]
        RewriteCond %{HTTP_USER_AGENT} ^EirGrabber [OR]
        RewriteCond %{HTTP_USER_AGENT} ^EmailSiphon [OR]
        RewriteCond %{HTTP_USER_AGENT} ^EmailWolf [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Express\ WebPictures [OR]
        RewriteCond %{HTTP_USER_AGENT} ^ExtractorPro [OR]
        RewriteCond %{HTTP_USER_AGENT} ^EyeNetIE [OR]
        RewriteCond %{HTTP_USER_AGENT} ^FlashGet [OR]
        RewriteCond %{HTTP_USER_AGENT} ^GetRight [OR]
        RewriteCond %{HTTP_USER_AGENT} ^GetWeb! [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Go!Zilla [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Go-Ahead-Got-It [OR]
        RewriteCond %{HTTP_USER_AGENT} ^GrabNet [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Grafula [OR]
        RewriteCond %{HTTP_USER_AGENT} ^HMView [OR]
        RewriteCond %{HTTP_USER_AGENT} HTTrack [NC,OR]
        RewriteCond %{HTTP_USER_AGENT} ^Image\ Stripper [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Image\ Sucker [OR]
        RewriteCond %{HTTP_USER_AGENT} Indy\ Library [NC,OR]
        RewriteCond %{HTTP_USER_AGENT} ^InterGET [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Internet\ Ninja [OR]
        RewriteCond %{HTTP_USER_AGENT} ^JetCar [OR]
        RewriteCond %{HTTP_USER_AGENT} ^JOC\ Web\ Spider [OR]
        RewriteCond %{HTTP_USER_AGENT} ^larbin [OR]
        RewriteCond %{HTTP_USER_AGENT} ^LeechFTP [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Mass\ Downloader [OR]
        RewriteCond %{HTTP_USER_AGENT} ^MIDown\ tool [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Mister\ PiX [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Navroad [OR]
        RewriteCond %{HTTP_USER_AGENT} ^NearSite [OR]
        RewriteCond %{HTTP_USER_AGENT} ^NetAnts [OR]
        RewriteCond %{HTTP_USER_AGENT} ^NetSpider [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Net\ Vampire [OR]
        RewriteCond %{HTTP_USER_AGENT} ^NetZIP [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Octopus [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Offline\ Explorer [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Offline\ Navigator [OR]
        RewriteCond %{HTTP_USER_AGENT} ^PageGrabber [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Papa\ Foto [OR]
        RewriteCond %{HTTP_USER_AGENT} ^pavuk [OR]
        RewriteCond %{HTTP_USER_AGENT} ^pcBrowser [OR]
        RewriteCond %{HTTP_USER_AGENT} ^RealDownload [OR]
        RewriteCond %{HTTP_USER_AGENT} ^ReGet [OR]
        RewriteCond %{HTTP_USER_AGENT} ^SiteSnagger [OR]
        RewriteCond %{HTTP_USER_AGENT} ^SmartDownload [OR]
        RewriteCond %{HTTP_USER_AGENT} ^SuperBot [OR]
        RewriteCond %{HTTP_USER_AGENT} ^SuperHTTP [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Surfbot [OR]
        RewriteCond %{HTTP_USER_AGENT} ^tAkeOut [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Teleport\ Pro [OR]
        RewriteCond %{HTTP_USER_AGENT} ^VoidEYE [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Web\ Image\ Collector [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Web\ Sucker [OR]
        RewriteCond %{HTTP_USER_AGENT} ^WebAuto [OR]
        RewriteCond %{HTTP_USER_AGENT} ^WebCopier [OR]
        RewriteCond %{HTTP_USER_AGENT} ^WebFetch [OR]
        RewriteCond %{HTTP_USER_AGENT} ^WebGo\ IS [OR]
        RewriteCond %{HTTP_USER_AGENT} ^WebLeacher [OR]
        RewriteCond %{HTTP_USER_AGENT} ^WebReaper [OR]
        RewriteCond %{HTTP_USER_AGENT} ^WebSauger [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Website\ eXtractor [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Website\ Quester [OR]
        RewriteCond %{HTTP_USER_AGENT} ^WebStripper [OR]
        RewriteCond %{HTTP_USER_AGENT} ^WebWhacker [OR]
        RewriteCond %{HTTP_USER_AGENT} ^WebZIP [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Wget [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Widow [OR]
        RewriteCond %{HTTP_USER_AGENT} ^WWWOFFLE [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Xaldon\ WebSpider [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Zeus [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Java [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Sogou\ web\ spider [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Sosospider+
        RewriteRule ^.* - [F,L]
        You can find the user agents hitting your site by downloading your visitor logs, and you can also look at your "latest visitors" in cPanel.

        This is what you're looking for - this is a good user agent:
        Mozilla/5.0 (Macintosh; Intel Mac OS X 10.7; rv:26.0) Gecko/20100101 Firefox/26.0

        This is one of the bad ones, because they ignore your robots.txt:
        Mozilla/5.0 (compatible; Baiduspider/2.0; +http://www.baidu.com/search/spider.html)

        Good bots obey your robots.txt file.
        Bad ones don't.
        Signature
        ---------------
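        If your host doesn't give you easy access to raw logs, a quick way to see which user agents and IPs are hitting the form is to log them yourself from the form handler. A minimal sketch; the form-hits.log file name is hypothetical.

        Code:
        <?php
        // Sketch: append each submitter's IP and user agent to a log file so you can
        // see which clients to block. Keep the log outside the web root or protect it.
        $line = sprintf(
            "%s\t%s\t%s\n",
            date('Y-m-d H:i:s'),
            $_SERVER['REMOTE_ADDR'],
            isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '(no user agent)'
        );
        file_put_contents('form-hits.log', $line, FILE_APPEND);
        ?>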
        • Profile picture of the author kpmedia
          Originally Posted by Karen Blundell View Post

          This is one of the bad ones, because they ignore your robots.txt:
          Mozilla/5.0 (compatible; Baiduspider/2.0; +http://www.baidu.com/search/spider.html)
          Baidu sucks. It's an aggressive Chinese search engine, and I long ago banned it.
          I have about 0.1% legit users/traffic from China (i.e., maybe 1 per 1,000).
        • Profile picture of the author Kingfish85
          Originally Posted by Karen Blundell View Post

          Rather than blocking IP addresses for bots, it is far more efficient to block the user agent or referer in your .htaccess file - or, in the OP's case, htaccess.txt.
          Here is a pretty good one:

          Code:
          RewriteEngine On
          RewriteCond %{HTTP_USER_AGENT} ^BlackWidow [OR]
          RewriteCond %{HTTP_USER_AGENT} ^Sosospider+
          RewriteRule ^.* - [F,L]
          Good bots obey your robots.txt file.
          Bad ones don't.

          Just remember, every line that's added to the htaccess file has to be processed. This will certainly slow down a site.
          Signature

          |~| VeeroTech Hosting - sales @ veerotech.net
          |~| High Performance CloudLinux & LiteSpeed Powered Web Hosting
          |~| cPanel & WHM - Softaculous - Website Builder - R1Soft - SpamExperts
          |~| Visit us @veerotech Facebook - Twitter - LinkedIn

        • Profile picture of the author RobinInTexas
          Originally Posted by Karen Blundell View Post

          Rather than blocking IP addresses for bots, it is far more efficient to block the user agent or referer in your .htaccess file - or, in the OP's case, htaccess.txt.
          Here is a pretty good one:

          Code:
          RewriteEngine On
          RewriteCond %{HTTP_USER_AGENT} ^Sogou\ web\ spider [OR]
          RewriteCond %{HTTP_USER_AGENT} ^Sosospider+ 
          RewriteRule ^.* - [F,L]
          You can find the user agents hitting your site by downloading your visitor logs, and you can also look at your "latest visitors" in cPanel.

          This is what you're looking for - this is a good user agent:
          Mozilla/5.0 (Macintosh; Intel Mac OS X 10.7; rv:26.0) Gecko/20100101 Firefox/26.0

          This is one of the bad ones, because they ignore your robots.txt:
          Mozilla/5.0 (compatible; Baiduspider/2.0; +http://www.baidu.com/search/spider.html)

          Good bots obey your robots.txt file.
          Bad ones don't.
          Rather than
          Code:
          RewriteRule ^.* - [F,L]
          which generates an error entry in your Apache log file,
          I prefer to use
          Code:
          RewriteRule (.*) http://%{REMOTE_ADDR}/ [R=301,L]
          which simply sends them on their way, or, if they are slow on the uptake, back to their own address.
          Signature

          Robin



          ...Even if you're on the right track, you'll get run over if you just set there.
  • Profile picture of the author nettiapina
    As others have suggested, blocking IPs to combat spam is futile. If someone is in that kind of business, they won't hesitate to proxy around your blocks.

    The IP addresses in the htaccess code weren't the ones you're trying to block, right? They're not public IP addresses.
    Signature
    Links in signature will not help your SEO. Not on this site, and not on any other forum.
    Who told me this? An ex Google web spam engineer.

    What's your excuse?
  • Profile picture of the author RobinInTexas
    It's going to take a while, but I am using the Wordfence plugin on WordPress sites and blocking any and all bots I see, with the exception of Google, Bing, and Yahoo.
    I'm also blocking IP ranges for any web hosts I find.

    Starting to look into using CloudFlare to fence them out far away from my hosting accounts.
    Signature

    Robin



    ...Even if you're on the right track, you'll get run over if you just set there.
  • Profile picture of the author kpmedia
    I send mine to a honeypot.
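    For anyone who hasn't seen the technique, a honeypot here usually means an extra form field that real visitors never fill in. A minimal sketch for a PHP contact form; the hidden field name "website" is hypothetical.

    Code:
    <?php
    // Sketch: the form contains an extra input named "website", hidden with CSS.
    // Humans leave it empty; most bots fill in every field, so any value means "bot".
    if (!empty($_POST['website'])) {
        // Pretend everything worked so the bot moves on, but don't send the email.
        header("Location: thank-you.html");
        exit();
    }
    // ...continue with the normal contact form processing...
    ?>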
  • Profile picture of the author wondy
    Riddle me this: if normal users are allowed to submit the form, how are you going to stop spammers from doing the same?

    You can't filter them out at the web level completely. Spammers can always just fill out the CAPTCHA themselves. What you need to do is install a reliable spam filter on your email server.
  • Profile picture of the author jkujami
    You can use this service to prevent any country or IPs from accessing your websites.

    MyIPBlocker - IP BLOCKER | COUNTRY BLOCKER

    Configure it properly and add a line to your page(s). It will do the rest for you. You can redirect the blocked traffic wherever you want as well, and there's no need for .htaccess.
