Need help with my robots.txt file please...
Like many of my other money sites, this one is hosted with HostGator and runs as a WordPress blog. I've never encountered this problem before, as I've never had to set up a robots.txt file manually.
For some reason, when I try to submit my sitemap to Google, I get a crawl error because my robots.txt file contains:
User-agent: *
Disallow: /
Because of the '/' after 'Disallow:', my entire site cannot be crawled.
I never set it up like that and I haven't a clue how to change it. I decided to manually create a new robots.txt file containing:
User-agent: *
Disallow:
so that my site should be crawled. I uploaded it to the '/public_html/mydomain dot com' directory about 10 hours ago, but Webmaster Tools still lists all the page addresses that cannot be accessed, and testing the file there shows the following:
URL: mydomain dot com/
Googlebot: Blocked by line 2: Disallow: /
Detected as a directory; specific files may have different restrictions
Googlebot-Mobile: Blocked by line 2: Disallow: /
Detected as a directory; specific files may have different restrictions
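One sanity check I'm planning to try is fetching the live robots.txt directly, to see exactly what Googlebot is being served rather than what sits in the directory. This is just a quick sketch, assuming Python 3 is available locally; the URL is a placeholder for my real domain:

# Fetch the robots.txt that the web server is actually serving,
# so it can be compared with the file uploaded to /public_html.
from urllib.request import urlopen

url = "http://example.com/robots.txt"  # placeholder for mydomain dot com
with urlopen(url) as resp:
    print(resp.read().decode("utf-8"))

If that printout still shows 'Disallow: /', then whatever is being served isn't the file I uploaded, and the old rule is coming from somewhere else.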
How on earth do I fix this? I need my site indexed by Googlebot ASAP.
Thanks,
Si