Robots.txt & internal filter links blocked

3 replies • SEO
Our internal links appear to be blocked by robots.txt once they are permuted beyond the category or brand pages. We set this up so that Google wouldn't index every combination of the filters in our left-side navigation. However, this now seems to be causing a problem. Example below:
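For context, filter blocking of this kind is usually done with prefix or wildcard Disallow rules. The fragment below is only an illustrative sketch of such a setup, not our actual file; the path patterns are made up, and note that the `*` wildcard is a Googlebot extension rather than part of the original robots.txt prefix-matching convention:

```
User-agent: *
# Hypothetical rules: block any URL whose path contains filter segments
Disallow: /*+9mm+
Disallow: /*+Large-Frame+
```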

When we use the on-page filtering on any product listing page, a unique URL is generated to reflect the filtered results. We use this in our internal link-building strategy on our blog, offering readers a link that shows a very specific result. For example, if we wanted to link to our entire offering of large-frame 9mm handguns, the URL would look like this:
https://www.company.com/CaliberGauge...plete-Handguns

The last section, "/MCategories+Complete-Handguns", is the starting point, and the rest of the URL represents the filters applied.

The problem is that those subsequent permutations appear to be blocked from crawlers by robots.txt, unless we are misreading the report from Google, which says "Sitemap contains urls which are blocked by robots.txt" with "Issues counted: 1,427". From what we can see, all of the URLs listed under that count are similar URLs with these advanced filter-based permutations. This seems to be blocking our ability to build link equity through our internal strategy. How can we resolve this issue?
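One way to verify exactly which permutations are blocked is to test candidate URLs against the robots.txt rules locally, for example with Python's urllib.robotparser. The rules and URLs below are hypothetical placeholders (with filters appended to the path for simplicity); the real URL structure and rules will differ:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt resembling a filter-blocking setup.
rules = """
User-agent: *
Disallow: /9mm+
Disallow: /Large-Frame+
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Check whether a generic crawler may fetch each URL.
urls = [
    "https://www.company.com/MCategories+Complete-Handguns",
    "https://www.company.com/9mm+Large-Frame+MCategories+Complete-Handguns",
]
for url in urls:
    print(url, "allowed" if parser.can_fetch("*", url) else "blocked")
```

One caveat: urllib.robotparser does plain prefix matching and does not understand Googlebot's `*` and `$` wildcard extensions, so if the live robots.txt uses those, this check only approximates what Google will do.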
#blocked #filter #internal #links #robotstxt