Alternative to Robots.txt

3 replies
Is there an alternative to robots.txt for protecting download pages? I've used robots.txt files in the past to keep my download pages from being indexed, but robots.txt is publicly readable, and some SEO tools display its contents — which exposes the very URLs I'm trying to hide.

Are there any better ideas?

Alex
#alternative #robotstxt
  • David
    .htaccess ?

    David Bruce Jr of Frederick Web Promotions
    Lawyer Local SEO - |

  • trishworks4u
    There are products out there that help you secure your downloads. I haven't purchased one (yet) and don't know how they work. I believe Big Mike has something in his arsenal... incansoft.com.

    I think you can also "tell" Google through your webmaster tools account which pages you don't want indexed.
  • Jesus Perez
    <html>
    <head>
    <title>...</title>
    <META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">
    </head>

    Add the meta tag above to the <head> of your download page to keep that page from being indexed. It won't stop people from sharing the link, though. For that you need DLGuard.

    Obviously, there's nothing anywhere that can stop someone from uploading to Bit Torrent style sites.
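
    Putting the suggestions above together: a minimal .htaccess sketch for the download directory. This assumes Apache with mod_headers enabled; the /downloads/ path and the .htpasswd location are hypothetical placeholders. The X-Robots-Tag response header does the same job as the meta tag, but also covers non-HTML files (zips, PDFs) that can't carry a meta tag, and unlike robots.txt it never publishes the protected URLs.

        # .htaccess placed in the download directory (e.g. /downloads/)

        # Ask search engines not to index anything served from here,
        # including non-HTML files. Requires Apache's mod_headers.
        <IfModule mod_headers.c>
            Header set X-Robots-Tag "noindex, nofollow"
        </IfModule>

        # Optionally require a login as well (create the password
        # file separately with the htpasswd utility):
        AuthType Basic
        AuthName "Downloads"
        AuthUserFile /path/to/.htpasswd
        Require valid-user

    Note that noindex only keeps pages out of search results; only the password block actually stops someone who already has the URL.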
