4 replies
Why do we use a robots.txt file, how does it help us, and is this the right syntax?

User-agent: *
Disallow: /crmservices.html
#file #robotstxt
  • andreajhon
    A robots.txt file asks spiders/crawlers not to access the pages or files it lists. If you want to keep part of your site out of Googlebot's crawl, robots.txt is the standard way to request that.
    The file must be saved as plain text in the site root. It uses two directives: User-agent, which names the crawler a rule applies to (e.g. Googlebot, or * for all crawlers), and Disallow, which lists the paths that crawler should not fetch.

    User-agent: *
    Disallow: /page-you-want-to-block.html

    To block every page on the site, use Disallow: / instead.
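As a quick sanity check of the syntax from the question, Python's standard library can parse those two lines and report what a rule-abiding crawler may fetch. This is a minimal sketch; example.com is a placeholder domain.

```python
from urllib.robotparser import RobotFileParser

# The rules from the question above; example.com stands in for the real site.
rules = [
    "User-agent: *",
    "Disallow: /crmservices.html",
]

parser = RobotFileParser()
parser.parse(rules)

# A crawler that honours robots.txt must skip the disallowed page,
# but may fetch everything else.
print(parser.can_fetch("*", "https://example.com/crmservices.html"))  # False
print(parser.can_fetch("*", "https://example.com/index.html"))        # True
```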
    • EugeneWHZ
      If you need to hide something in your site from google bot, it is good to use robots.txt file.
      Robots can ignore your /robots.txt. In particular, malware robots that scan the web for security vulnerabilities and the email-address harvesters used by spammers will pay it no attention. The /robots.txt file is also publicly available: anyone can see which sections of your server you don't want robots to use.

      So if you want to truly hide something on your website, you should look for alternative methods. For example, you can restrict access by IP via an .htaccess file.
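If you do need real access control, one common approach is an .htaccess file in the directory you want to protect. A minimal sketch, assuming Apache 2.4; the IP address is a placeholder:

```apache
# .htaccess in the protected directory (Apache 2.4 syntax).
# Only the listed IP may access it; everyone else gets a 403.
Require ip 203.0.113.10
```

Unlike robots.txt, this is enforced by the server, so crawlers and visitors alike are actually blocked rather than merely asked to stay away.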
  • nettiapina
    Originally Posted by Erbrains:

    Why do we use a robots.txt file, how does it help us, and is this the right syntax?
    You're asking us why you're doing something. Shouldn't you be the one to answer that question? Why on earth would you ask us?
    Signature
    Links in signature will not help your SEO. Not on this site, and not on any other forum.
    Who told me this? An ex-Google web spam engineer.

    What's your excuse?