Please help me solve my issue.
I got a direct message from Google in GWT (Google Webmaster Tools) that says:
Over the last 24 hours, Googlebot encountered 39 errors while attempting to access your robots.txt. To ensure that we didn't crawl any pages listed in that file, we postponed our crawl. Your site's overall robots.txt error rate is 30.5%.
I really don't know how to fix this error, and I'm a little worried about my website's performance because of it.
I also planned to disallow some URLs in my robots.txt (the URLs shown in the Crawl Errors section).
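For reference, this is the kind of rule I was planning to add (the paths below are just placeholders, not my real URLs from the Crawl Errors report):

```
User-agent: *
Disallow: /some-error-path/
Disallow: /another-error-path/
```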
Please give me some valid suggestions to get rid of this issue!
Thanks in advance.