Disallow Robots from Crawling All HTML Pages

1 reply
Is the code below correct to disallow all robots from crawling all HTML pages on my site?

User-agent: *
Disallow: /.html

Allow: /wp-admin/admin-ajax.php

Sitemap: (mydomain)/sitemap.xml

Thanks in advance
#crawling #disallow #html #pages #robots
  • jimmy adam
    @vema123 It's odd that you're using WordPress but still have .html pages. Also, the code you posted won't do what you want: Disallow: /.html only blocks URLs whose path starts with /.html. To block every URL ending in .html, use a wildcard pattern in robots.txt (the * and $ wildcards are supported by the major crawlers such as Googlebot and Bingbot):
    User-agent: *
    Disallow: /*.html$
    I would also prefer to set up 301 redirects for the relevant pages.
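    A minimal sketch of such a redirect, assuming an Apache server where rules can be added to the site's .htaccess file; /old-page.html and /old-page/ are hypothetical paths standing in for your real URLs:
    # .htaccess (Apache, mod_alias)
    # Permanently redirect a legacy .html URL to its new location.
    # Replace the paths below with your actual old and new URLs.
    Redirect 301 /old-page.html /old-page/
    Repeat one Redirect line per page you have moved, then confirm each old URL returns a 301 status before blocking it in robots.txt.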