Do search engines index secure webpages?

8 replies
I have done some research and it appears that they do, but I wanted to get some other opinions. On one of my websites I have secure pages for ordering, but switching users between secure and non-secure pages gets to be a pain, so I was thinking about just making every page on the site secure.
#engines #index #search #secure #webpages
  • Steven Carl Kelly
    Yes, they do. I am familiar with several sites that can ONLY be accessed through port 443 and all of them are indexed.
  • Sylvan
    I think it depends on your cookie settings. How secure is your page? I always disallow access to my download directories without the proper link, and disallow spiders from indexing those pages.
  • Mike Bogowski
    They will index secure pages just fine.

    You'll need to consider the overhead of running your entire site through HTTPS, though. The encryption can be a real CPU killer if you are getting a lot of traffic.
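
    If you do move the whole site to HTTPS, a common companion step (a sketch assuming Apache with mod_rewrite enabled; adjust for your own server) is a blanket 301 redirect in .htaccess so visitors and spiders never stay on the plain-HTTP pages:

    ```apache
    # Redirect every HTTP request to its HTTPS equivalent
    RewriteEngine On
    RewriteCond %{HTTPS} off
    RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
    ```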
  • onera
    Google will index anything that it can reach unless you tell the Google crawler not to crawl a directory using the robots.txt file.
  • Curt Dillion
    I'm aware of the robots.txt file being used to stop indexing. Can anyone tell me what to put in it to stop search engines from indexing an entire site? I have one site that I use just for my thank you pages and downloading my products.
    • Lavinco
      Originally Posted by Curt Dillion

      I'm aware of the robots.txt file being used to stop indexing. Can anyone tell me what to put in it to stop search engines from indexing an entire site? I have one site that I use just for my thank you pages and downloading my products.
      Make sure this is really what you want before you do it.

      Simply create a file named robots.txt in your site's root directory and add:
      User-agent: *
      Disallow: /
      • Curt Dillion
        Originally Posted by Lavinco

        Make sure you want to do this for sure first.

        Simply create a file named robots.txt and add
        User-agent: *
        Disallow: /
        Thank you.
        • Lavinco
          Originally Posted by Curt Dillion

          Thank you.
          You're welcome.
          To better explain what this is doing: the / represents the entire root directory.

          User-agent: *
          Disallow: /

          If you specified a subfolder like

          User-agent: *
          Disallow: /images/


          then you are telling the bots to stay out of the /images/ folder. When you add the / alone, you are instructing the bot to stay out of the entire site.
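
          A quick way to sanity-check robots.txt rules before you rely on them is Python's standard urllib.robotparser (just a sketch; the domain and paths here are made-up examples):

          ```python
          from urllib.robotparser import RobotFileParser

          # The subfolder example from above: every bot is barred from /images/ only
          rp = RobotFileParser()
          rp.parse(["User-agent: *", "Disallow: /images/"])

          print(rp.can_fetch("*", "https://example.com/order.html"))    # True: still crawlable
          print(rp.can_fetch("*", "https://example.com/images/a.png"))  # False: blocked
          ```

          Swap in Disallow: / and can_fetch returns False for every URL on the site.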
