Question Regarding Crawl Delay in robots.txt

5 replies
  • SEO
Hello,

What is Crawl-delay in robots.txt?
I checked my site's robots.txt in the webmaster tools and found this:

User-agent: *
Crawl-delay: 1
Disallow: /wp-content/plugins/
Disallow: /wp-admin/

Is it bad for search engine traffic?
I don't know about Crawl-delay. What does it mean, and what is the right format for it?
My site (Clciktrends - Unleash the Digital Trends) has around 60+ posts.

Any advice to solve this issue?

Thanks
#crawl #delay #question #robottxt
  • alysehubbard
    Yes, since you have put * at the start of the code, the rules apply to every robot, so not a single robot will be able to crawl that data on your website. So it is better to use noindex and nofollow to avoid the search engine ranking drawbacks.
  • uditsh
    Originally Posted by robbert

    What is Crawl-delay in robots.txt?
    I checked my site's robots.txt in the webmaster tools and found this:

    User-agent: *
    Crawl-delay: 1
    Disallow: /wp-content/plugins/
    Disallow: /wp-admin/
    I don't know much about the Crawl-delay parameter and have never used it. But from what I found on the web, Crawl-delay is a directive that instructs a crawler to wait a given number of seconds between successive requests to the same server.
    In your case, 'Crawl-delay: 1' instructs crawlers to wait for 1 second between requests.

    According to a few sources, Googlebot ignores the Crawl-delay rule and instead uses the crawl rate setting (Change Googlebot crawl rate).

    Google Webmaster Forum: I received a warning from GWT saying that the googlebot will ignore my crawl delay rule (found in my robots.txt file).
    Wiki Article: Several major crawlers support a Crawl-delay parameter, set to the number of seconds to wait between successive requests to the same server.
    Yandex Webmaster: If the server is overloaded and does not have enough time to process downloading requests, use the Crawl-delay directive. It enables you to specify the minimum interval (in seconds) for a search robot to wait after downloading one page, before starting to download another.
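    To make this concrete, here is a rough sketch (just my own illustration, not from the sources above) of what honouring Crawl-delay looks like inside a crawler, using Python's standard library. The site URL, paths, and bot name are placeholders; a site that declares no Crawl-delay for the agent simply gets no pause.

    Code:
    import time
    import urllib.error
    import urllib.request
    from urllib.robotparser import RobotFileParser

    SITE = "https://example.com"            # placeholder site
    PAGES = ["/", "/about/", "/contact/"]   # placeholder paths
    USER_AGENT = "ExampleBot"               # hypothetical bot name

    robots = RobotFileParser(SITE + "/robots.txt")
    robots.read()

    # Fall back to no pause if the site declares no Crawl-delay for this agent.
    delay = robots.crawl_delay(USER_AGENT) or 0

    for path in PAGES:
        url = SITE + path
        if robots.can_fetch(USER_AGENT, url):
            request = urllib.request.Request(url, headers={"User-Agent": USER_AGENT})
            try:
                with urllib.request.urlopen(request) as response:
                    print(url, response.status)
            except urllib.error.URLError as exc:
                print(url, "failed:", exc)
        # The pause between successive requests to the same server is all
        # that Crawl-delay asks a crawler to do.
        time.sleep(delay)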
    Originally Posted by robbert

    What is the right format for it?
    I am not totally sure about this format of using a single Crawl-delay for all user agents (User-agent: *). Instead of that format, IMO, you should use a separate Crawl-delay parameter for each user agent, because this parameter is not recognized by every user agent.

    For Example:

    # Microsoft Search Engine Robot
    User-agent: msnbot
    Crawl-delay: 1

    # Yandex Search Engine Robot
    User-agent: Yandex
    Crawl-delay: 1
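    If you want to check which delay a given bot would actually pick up from a file like the one above, Python's built-in robots.txt parser can report it. This is only an illustrative sketch: the robots.txt contents are the example from this post, and Googlebot is included just to show what happens when no rule applies to an agent.

    Code:
    from urllib.robotparser import RobotFileParser

    ROBOTS_TXT = """\
    # Microsoft Search Engine Robot
    User-agent: msnbot
    Crawl-delay: 1

    # Yandex Search Engine Robot
    User-agent: Yandex
    Crawl-delay: 1
    """

    parser = RobotFileParser()
    parser.parse(ROBOTS_TXT.splitlines())

    for bot in ("msnbot", "Yandex", "Googlebot"):
        # crawl_delay() returns None when no Crawl-delay applies to the agent.
        print(bot, "->", parser.crawl_delay(bot))

    # Expected output:
    #   msnbot -> 1
    #   Yandex -> 1
    #   Googlebot -> None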
    • uditsh
      Originally Posted by robbert

      What should be the correct format?

      Either use this parameter for each specified bot:
      Originally Posted by uditsh


      For Example:

      # Microsoft Search Engine Robot
      User-agent: msnbot
      Crawl-delay: 1

      # Yandex Search Engine Robot
      User-agent: Yandex
      Crawl-delay: 1
      Or, use this for all bots:

      User-agent: *
      Crawl-delay: 1

      Note that both formats are correct, but with the second one, don't expect it to work with every bot.

      Originally Posted by robbert

      Is that format bad for search engine traffic?
      No. If you are getting somewhat aggressive bot hits, then use this parameter to slow those robots down (a hypothetical example is sketched at the end of this reply).

      Additional info:
      Bing Webmaster: As so many factors are involved in the crawl rate, there is no clear, generic answer as to whether you should set a crawl delay.
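      For the aggressive-bot case mentioned above, a hypothetical robots.txt could give the noisy bot a longer delay than everything else. The bot name and numbers here are only illustrative, and keep in mind that Googlebot ignores Crawl-delay either way:

      # Hypothetical: slow down one particularly aggressive bot
      User-agent: Bingbot
      Crawl-delay: 10

      # A small default delay for every other bot that honours the directive
      User-agent: *
      Crawl-delay: 1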
  • robbert
    Thanks for the great explanation, but I am still confused. What should be the correct format? And is that format bad for search engine traffic?
  • saeedrk
    I have the same issue with my WordPress blog. I want to ask you guys: does this robots.txt have any effect on our blogs?
