Wow, Protect Your Damn Download Pages!

20 replies
I cannot believe the number of people who make no effort to protect their download pages. Let this be a warning.

I won't give any phrases out, but suffice it to say that if you run a strategically worded query in Google, you will get tons of indexed download pages with direct links to products.

If you're not going to protect download pages (or files for that matter) from direct access, you need to AT LEAST protect them from search engines.

Don't forget! These are quick fixes which have been improved on in the replies, but they still won't adequately protect your download pages. The main message here is: get a download manager.

ROBOTS.TXT VERSION

This can be done by putting a robots.txt file in your ROOT directory with nothing in it but this:

User-agent: *
Disallow: /yourdownloadpage.html

You can also have multiple disallows:

User-agent: *
Disallow: /yourdownloadpage.html
Disallow: /dir/yourdownloadpage2.html

Finally, you can disallow an entire directory using the same format and protect your files:

Disallow: /productdownloads/

Then chuck all your product files in that directory and you won't have the search engines indexing your files either.

Originally Posted by CDarklock View Post

If your robots.txt says "Disallow: /myreallycoolstuff"... who isn't going to look there?

Make a folder UNDER it with your download file in it. So people download from /myreallycoolstuff/adlkfjeoirhonecc/file.zip - and it will be harder to find.



Originally Posted by aandersen View Post

if you put it in /myreallycoolstuff/adlkfjeoirhonecc/file.zip

and you
Disallow: /myreallycoolstuff/

PLEASE disable directory listing in /myreallycoolstuff/

That's done with an .htaccess file containing:

Options -Indexes

The file should go in the directory you want to protect, or in most cases the root of your domain. This also assumes you're on a Linux server running Apache (if you don't know what that is, you probably are).

META TAG VERSION

Easy peasy, put within the <head></head> section of your site:

<meta name="robots" content="noindex">
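Side note: the meta tag only covers HTML pages. For the files themselves (zips, PDFs), the X-Robots-Tag response header does the same job. Assuming your host runs Apache with mod_headers enabled, something like this in your download directory's .htaccess should work:

```apache
# Ask search engines not to index anything served from this directory.
# Requires Apache's mod_headers module (common on shared hosts, but not guaranteed).
<IfModule mod_headers.c>
Header set X-Robots-Tag "noindex, nofollow"
</IfModule>
```

Same caveat as robots.txt: it's a request, not access control.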

WARNING

Not all search engines play nice. Some WILL still index your pages regardless of what you instruct in your robots.txt file, and even the well-behaved ones can list a disallowed URL if another site links to it. I cannot stress enough the importance of using a decent download manager.

I've attached the robots.txt file here for anybody who needs it. Simply change as you see fit.

You can also do a quick fix with PHP, by making sure the visitor was referred by your payment provider before the page or file can be accessed. Again, not ideal, but it's a quick fix.
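A rough sketch of that referrer check (the paypal.com domain and the redirect target are placeholders - swap in your own payment provider and pages; and remember the Referer header can be empty or spoofed, so this is a speed bump, not security):

```php
<?php
// Quick-fix referrer check. NOT real security: the Referer header
// is optional and trivially faked - treat it purely as a speed bump.
// 'paypal.com' is a placeholder; use your payment provider's domain.
$referer = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';
$host    = parse_url($referer, PHP_URL_HOST);

if (!is_string($host) || !preg_match('/(^|\.)paypal\.com$/i', $host)) {
    // Visitor didn't arrive from the payment provider: send them home.
    header('Location: /');
    exit;
}

// ...otherwise fall through and render the download page as normal.
```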

Hope this helps.

Best,
Damien.

*edit*

To clarify - this came to mind when I was working on a project recently which uses what I would say is quite a popular system. It didn't recommend or offer any kind of protection. So, naturally, I protected the pages using the methods above and advised the client to use a secure download manager. All I had to do was search a particular "phrase" in Google and wooooooshhhh..tons of download pages with direct access to the authors' products.

Unbelievable.
#damn #download #pages #protect #wow
  • Profile picture of the author doop
    Definitely good advice!

    If you're going to be serious about making a living online or even a few extra dollars - invest in software to protect your downloads!!!

    It'd be akin to trying to sell something in the real world and keeping your stock sitting on the back dock rather than locked up in your storeroom!
  • Profile picture of the author JustKid
That's why I always password-protect my pages too.
  • Profile picture of the author Damien Roche
The plot thickens. This is something I haven't looked into much because I don't have my own product, and most of the people I've worked with already have download protection.

Turns out, there are even pages put up to fool people into thinking they've accessed a download page, when it's more likely a list of adware or something. I noticed a common template being used, and the list is always .exe files. Jeez, the phrase is even in Google's suggested searches.

Anyway, I then found a download page for a *very* experienced marketer on this forum. Of course, I didn't touch his links, but I submitted a ticket to his support desk. Not sure if he was doing what I mentioned above to pull those searchers back in.

..and to emphasize, the above solution is by no means a *long term* solution. It is a quick fix until you get yourself some decent download protection.
    Signature
    >> Seasoned Web Developer (CSS, JavaScript, PHP, Ruby) <<
    Available for Fixed Fee Projects and Hourly ($40/hr)
  • Profile picture of the author mr2monster
    This is why most of my actual thank you pages are on my "product hub" and NOT on my sales site.


    i.e. If I were selling cool blue widgets on coolbluewidgets.com the customer would buy, then be forwarded to genericdomainthatallmyproductsarehostedon.com/coolbluewidgetencodeddownloadfile.html

    They get taken OFF the site to get their products. The sales site doesn't link them to the product hub, only the payment button confirmation and my email confirmation do.

    This doesn't prevent someone from sharing your DL page after they buy, but it does make it very hard to do a site search or a tricky search query for my download page.

    And soon, I'll be getting DL Guard or something similar to help offset the sharing of my DL pages too.
  • Profile picture of the author CDarklock
    Here's another bit for you.

    If your robots.txt says "Disallow: /myreallycoolstuff"... who isn't going to look there?

    Make a folder UNDER it with your download file in it. So people download from /myreallycoolstuff/adlkfjeoirhonecc/file.zip - and it will be harder to find.

But yeah, get DLGuard or something. I just got PLR to something that secures Clickbank products; I might be able to hack something out of it. Haven't seen how good it is yet, though.
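If you'd rather not just mash the keyboard for that inner folder name, a one-liner can generate something genuinely unguessable. A sketch (assumes PHP 7+ for random_bytes; on older PHP, openssl_random_pseudo_bytes is a substitute):

```php
<?php
// Print a random, unguessable directory name to use as the hidden
// subfolder, e.g. /myreallycoolstuff/<output>/file.zip
// Run once, create the folder, upload your files there.
echo bin2hex(random_bytes(16)) . "\n"; // 32 hex characters
```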
    Signature
    "The Golden Town is the Golden Town no longer. They have sold their pillars for brass and their temples for money, they have made coins out of their golden doors. It is become a dark town full of trouble, there is no ease in its streets, beauty has left it and the old songs are gone." - Lord Dunsany, The Messengers
  • Profile picture of the author Kezz
    There's always E-Junkie as well.

    You can set it up to work with ClickBank and act solely as your product delivery mechanism.

    Super cheap and in my experience an excellent service.
  • Profile picture of the author Nigel Greaves
    Damien's advice is good if you can't afford DLGuard* yet but I'd second Caliban's advice to get DLGuard from fellow Warrior Sam Stephens if you're serious about protecting your download page.

    Both the product and Sam's customer service are outstanding and worth every penny.

    Nigel

    *This link is not an affiliate link
  • Profile picture of the author Karan Goel
    To Damien:

Look man. Anyone can see your robots.txt file and easily work out what you're "hiding". That can leave your pages even more insecure and exposed.

The best way is to have at least a few hundred fake directories in robots.txt, with the real download directory randomly positioned among them. As soon as a person looks at the file, he'll close the browser and move on.

    Karan
    Signature
    Penalty Safe, Long Term, 100% Whitehat Backlinks
    Love your site? Then check out SafeSpokes!
    ~_~_~_~_~_~_~_~_~_~_~_~_~_~_~_~_~_~_~_
    karan996@irchiver.com karan997@irchiver.com
    • Profile picture of the author Damien Roche
      Originally Posted by Karan Goel View Post

      To Damien:

Look man. Anyone can see your robots.txt file and easily work out what you're "hiding". That can leave your pages even more insecure and exposed.

The best way is to have at least a few hundred fake directories in robots.txt, with the real download directory randomly positioned among them. As soon as a person looks at the file, he'll close the browser and move on.

      Karan
      Yeh, as C mentioned above - you can put your files in *another* directory you haven't revealed in robots.txt.

      I'll edit my post to reflect that. It's top advice.

      Of course, and like I did mention above several times, this is a short-term, quick fix. No product creator should do without adequate download management to protect their work.

      So, instead of improving the quick fixes, we may want to look at the long-term solutions.

      Make sense?

      Robots.txt isn't always listened to.
      Signature
      >> Seasoned Web Developer (CSS, JavaScript, PHP, Ruby) <<
      Available for Fixed Fee Projects and Hourly ($40/hr)
      • Profile picture of the author aandersen
It's not even a "fix", really; robots.txt was never intended for access control. Access control should be handled by your server (e.g. authentication), a download management solution, or something else designed for that purpose.

However, downloads and products aside, this is good practice in general and people should be conscious of it. Any time you put anything on a web server, you should think about whether or not you would want that piece of info showing up in a Google search.

What if you upload a script to your server that uses txt or XML data files? Would you want that data to be accessible through Google? What about your images, videos, spreadsheets, etc.? Really, anything.
        Signature

        signature goes here

        • Profile picture of the author Steven Wagenheim
          One word...DLGuard

          And Sam is the best when it comes to support.

          Best investment I ever made.
        • Profile picture of the author Damien Roche
          Originally Posted by aandersen View Post

It's not even a "fix", really; robots.txt was never intended for access control. Access control should be handled by your server (e.g. authentication), a download management solution, or something else designed for that purpose.

However, downloads and products aside, this is good practice in general and people should be conscious of it. Any time you put anything on a web server, you should think about whether or not you would want that piece of info showing up in a Google search.

What if you upload a script to your server that uses txt or XML data files? Would you want that data to be accessible through Google? What about your images, videos, spreadsheets, etc.? Really, anything.
..and that's exactly the point. As I mentioned in my post above, there is a 'high profile' Warrior making this fatal mistake. I've informed his support; that's all I can do.

          But people are obviously still struggling with the fact that search engines WILL tell people where your stuff is, even the stuff you want to protect.

          Originally Posted by Steven Wagenheim View Post

          One word...DLGuard

          And Sam is the best when it comes to support.

          Best investment I ever made.
Well, yeh, that's the solution. I really am trying NOT to come across as posing a problem and then..bam.."get DLGuard". Eww, it just seems like an icky sales message or something.

          Of course, I KNEW people would suggest that so it's all good

          Top class product, fantastic support.
          Signature
          >> Seasoned Web Developer (CSS, JavaScript, PHP, Ruby) <<
          Available for Fixed Fee Projects and Hourly ($40/hr)
  • Profile picture of the author aandersen
Caliban is spot on with this one.

    to add one more thing

    if you put it in /myreallycoolstuff/adlkfjeoirhonecc/file.zip

    and you
    Disallow: /myreallycoolstuff/

    PLEASE disable directory listing in /myreallycoolstuff/


*edit: just saw Karan's post - this will address that issue more effectively
    Signature

    signature goes here

    • Profile picture of the author CDarklock
      Originally Posted by aandersen View Post

      PLEASE disable directory listing in /myreallycoolstuff/
      Easy way to do this... upload a blank index.html

      When you deliver products that need directory listing disabled, but your idiot customer doesn't know what .htaccess is - let alone that the dot in the beginning is important - a blank index.html does the job without having to explain things.

      There's also this annoying tendency on some hosts to completely disable .htaccess files, and you have to call them on the phone to get the relevant rules enabled, and your client isn't qualified so guess who gets to sit on hold for twenty minutes to do it?

      Blank index.html works a lot better.
      Signature
      "The Golden Town is the Golden Town no longer. They have sold their pillars for brass and their temples for money, they have made coins out of their golden doors. It is become a dark town full of trouble, there is no ease in its streets, beauty has left it and the old songs are gone." - Lord Dunsany, The Messengers
  • Profile picture of the author Underground SEO
Good advice there, thanks. I was doing this already, but your thorough instructions will be extremely useful for many more people.
  • Profile picture of the author Chris-
Would this work? Make the download directory password protected (using cPanel's "password protect directories"), then give people the password when they purchase the product. And change the password frequently, in case someone posts it somewhere!

I haven't tried that myself, but I was thinking about the problem a while ago and wondered whether this would be a safe way to do it. Presumably it stops both robots and searches?
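From what I understand, all that cPanel feature does is write HTTP Basic Auth rules into the directory's .htaccess, something like this (the AuthUserFile path is illustrative - cPanel generates the real one for you):

```apache
# HTTP Basic Auth for the download directory - roughly what
# cPanel's "password protect directories" produces.
AuthType Basic
AuthName "Customer Downloads"
AuthUserFile /home/youraccount/.htpasswds/productdownloads/passwd
Require valid-user
```

So yes, it should stop robots and searchers - though a shared password can still be passed around.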

    Chris-
    Signature

    If you have a product/course, or list/following, I will help you do a WEBINAR JV for best results. Click here now . . .

  • Profile picture of the author felixg
Hi, I have been posting in the website design forum for some time now getting advice on website building, but I want to ask a question here about this easy access to download pages.

    Would the software weblock be any use for this?

    Thank You
    Felix
    Signature

    No affiliate/MLM links in signatures, please.

  • Profile picture of the author Ben Holmes
And for those search engines that don't play nice - stop naming your download access page "thankyou.html" or similar. It's all too easy to find such pages. Mixing up the capitalization is a helpful (although not foolproof) way to stop people passing the download page along, too... like this: "/DatingGuideForYou.htm"

For real fun, name the download page file randomly, like this: 'style.csr', then add one of these:

AddType application/x-httpd-php .csr (if your download page is PHP)
AddType text/html .csr (if your download page is plain HTML)

to your .htaccess file. People are looking for html or php pages; they'll ignore what they think is just css styling... (hmm... I'll bet '.dll' would be another great filetype to use!) Then add this:

# protect the htaccess file
<Files .htaccess>
Order allow,deny
Deny from all
</Files>

    to your .htaccess file as well, to prevent anyone from seeing what you're doing.

In other words, simply make it so difficult that the hacker moves on to something easier to find. Or, of course, as others have already said, simply buy a good download manager. But these are a few tricks to try if you can't afford one yet.
  • Profile picture of the author searchnology
I wouldn't recommend relying on the robots.txt file, since anyone can actually see that file, i.e. www.yourdomain.com/robots.txt

What has worked for me is a referrer-checking script on my download page that checks whether the user's previous URL was the correct one from my site (or from whatever site). If they try to access the download page any way other than through that defined URL, they are redirected back to my homepage.

    Hmmm...seems like a simple (and cheap) download protection script could be a nice product for folks on this forum.
    Signature
    Google's Keyword Tool is Gone!..You will NEED this! - Watch Demo that Uncovers 1000s of KEYWORDS Other Tools Miss! »


    • Profile picture of the author dougp
As mentioned several times already, robots.txt was never meant to secure files. The best thing you can do is use a secured download delivery system like DLGuard. Sure, you can use one of those ebook security devices that logs the purchaser's IP and can block individuals with different IPs from viewing the file, but there are always ways around this. If multi-billion dollar companies like Microsoft can't prevent hackers from cracking their software and putting it on torrent websites, do you really think you have a 100% solution to stop these rip-off artists? With that being said, what you should also do is make sure your products are not being sold for profit by other individuals. A Warrior put together a nice blog post on combating piracy that anyone interested can learn more from: Domaining Diva Blog Archive What You Can Do About “Paypal Dispute Ripoff Artists”
