Robots.txt Best Practice - SEO
Supposedly adding a Sitemap line to robots.txt makes it easier for search engines to find your sitemap, but if you've already submitted the sitemap to Google Search Console and Bing Webmaster Tools, isn't that just duplicating your work?
Is there any downside? I always thought of the robots.txt file as an "anti-sitemap," basically telling search engines which pages on your site you don't want them to crawl.
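For reference, the Sitemap directive being asked about is just one extra line in robots.txt. A minimal sketch (the domain and sitemap path are placeholders):

```
# Allow all crawlers to access the whole site
User-agent: *
Disallow:

# Point crawlers at the sitemap (absolute URL required)
Sitemap: https://www.example.com/sitemap.xml
```

Note the directive is independent of the Allow/Disallow rules, so it doesn't make robots.txt any less of an "anti-sitemap": the Disallow lines still control crawling, while the Sitemap line is purely a discovery hint for crawlers that haven't been pointed at the sitemap through a webmaster-tools submission.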