Googlebot can't access my site
Over the last 24 hours, Googlebot encountered 3 errors while attempting to access your robots.txt. To ensure that we didn't crawl any pages listed in that file, we postponed our crawl. Your site's overall robots.txt error rate is 100.0%.
It's weird because it was working fine last week, so I'm sure something changed in the server configuration and this issue emerged.
I suspect that the security settings were raised to the highest level after that server outage, and they now block everything (including Googlebot) from crawling my site.
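One way to test this theory is to request robots.txt twice, once with a generic browser User-Agent and once with the Googlebot User-Agent string that Google publishes, and compare the status codes. This is just a sketch; example.com stands in for my actual domain:

```python
import urllib.request
import urllib.error

def fetch_status(url: str, user_agent: str) -> int:
    """Return the HTTP status code for url when requested with the given User-Agent."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        # A 403/503 here (but not for the browser UA) would suggest the
        # security layer is filtering requests by User-Agent.
        return e.code

site = "https://example.com/robots.txt"  # replace with your own domain
browser_ua = "Mozilla/5.0"
googlebot_ua = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

print("browser  :", fetch_status(site, browser_ua))
print("googlebot:", fetch_status(site, googlebot_ua))
```

If the browser request returns 200 but the Googlebot-style request is rejected, that would point at a User-Agent filter; if both fail, the block is broader (e.g. a firewall rule or IP-range block).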
Any advice would be appreciated.