Googlebot and robots.txt

3 replies
  • SEO
I was just in my Google Webmaster Tools account adding my URL and sitemap, and when I "tested" it, it said it couldn't crawl my site because robots.txt had "Disallow: /".

If you look through the configuration options, there is one for saying you want your site to be crawled, which changes it to "Allow: /" (or whatever the exact syntax is).

My question: do you typically want an "allow" or a "disallow"? This is a WordPress site and I didn't change anything, so I guess Disallow was on by default. My first thought is that "Allow" is better.
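For reference, here's roughly what the two variants look like (a sketch; your actual file may differ, and WordPress often generates a virtual robots.txt rather than a physical one):

```text
# Variant 1: blocks ALL crawlers from the entire site
# (this is what the tester is complaining about)
User-agent: *
Disallow: /

# Variant 2: allows all crawlers to access everything
# (an empty Disallow matches nothing, so nothing is blocked)
User-agent: *
Disallow:
```

If your goal is to appear in search results, you want the second form (or no robots.txt at all).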

Thoughts?
#googlebot #robotstxt
