Robots.txt file for blog - SEO
My blog is hosted on HubSpot CMS, and its URL is http://blog.example.com. The tag pages are causing duplicate content issues.
e.g.
http://blog.example.com/blog/?Tag=Quality
http://blog.example.com/blog/?Tag=HIE
HubSpot CMS does not let me set canonical tags on these pages, so I have tried to block them in robots.txt:
User-agent: *
Disallow: /blog/?Tag=Quality
Disallow: /blog/?Tag=HIE
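As a sanity check on the rules above, Python's standard urllib.robotparser can simulate how a compliant crawler interprets them. This is only a rough check: it does literal prefix matching and does not implement Google's * wildcard extension.

```python
import urllib.robotparser

# The exact rules from the robots.txt above.
rules = [
    "User-agent: *",
    "Disallow: /blog/?Tag=Quality",
    "Disallow: /blog/?Tag=HIE",
]

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules)

# The listed URLs are blocked for a compliant crawler...
print(rp.can_fetch("*", "http://blog.example.com/blog/?Tag=Quality"))  # False
# ...but matching is case-sensitive and prefix-based, so variants slip through:
print(rp.can_fetch("*", "http://blog.example.com/blog/?tag=quality"))  # True
print(rp.can_fetch("*", "http://blog.example.com/blog/?Tag=Safety"))   # True
```

The rules do match the two exact URLs, so the syntax itself is not the problem. Note, though, that a Disallow rule only stops crawling; URLs that are already indexed can remain in search results.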
But it has not had any effect.
Can you please tell me how to get rid of these pages?
Thanks in advance for your help.