I installed the Google XML sitemap generator plugin and submitted my sitemap. Then I added my site to Google Webmaster Tools and tried to submit the sitemap through their tools.
I got a message saying "URL restricted by robots.txt", which meant my pages were blocked. So I downloaded a plugin called irobotseo.txt, which shows you the robots.txt file.
I went to the plugin's settings, viewed the robots.txt file, and deleted everything in it, since I didn't know how to edit it to allow search engines to crawl.
I panicked, deleted the irobot plugin, and tried to resubmit my sitemap through Webmaster Tools. It failed again, so I deleted my sitemap.
I then went to the crawler access tool and tested the file, and got this message: "Blocked by line 2: Disallow: / Detected as a directory; specific files may have different restrictions".
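Based on that error, I'm guessing the robots.txt file Google is seeing looks something like this (just my guess from the "line 2" in the message, not the actual file contents):

```
User-agent: *
Disallow: /
```

From what I've read, that `Disallow: /` line blocks all crawlers from the entire site, and an allow-everything file would instead leave the Disallow value empty (`Disallow:`). But I'm not sure, which is why I'm asking.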
So my question is: how do I allow search engines to crawl my website? And if I have to modify the robots.txt file, what do I type, and where do I type it? I am using WordPress with the FlexSqueeze theme, hosted by HostGator.