When we use the on-page filtering on any page of product boxes, a unique URL is generated to reflect the filtered results. We use this in our internal link-building strategy on our blog to offer readers a link that shows a very specific result. For example, if we wanted to link to our entire offering of large-frame 9mm handguns, the URL would look like this:
Here, the segment "/MCategories+Complete-Handguns" is the starting point, and the rest of the URL represents the filters applied.
The problem is that those filtered permutations appear to be blocked from crawlers by robots.txt, unless we are misreading the report from Google, which says "Sitemap contains urls which are blocked by robots.txt" with "Issues counted 1,427". From what we can see, all 1,427 flagged URLs are similar filter-based permutations like the one above. This appears to be blocking our ability to build link equity through our internal linking strategy. How can we resolve this issue?
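One way to confirm whether Search Console's report matches your actual robots.txt rules is to test the URLs locally. Below is a minimal sketch using Python's standard-library `urllib.robotparser`; the robots.txt rule and URLs shown are hypothetical placeholders, not your real configuration, so substitute your own rules and a sample of the 1,427 flagged URLs.

```python
# Sketch: test which URLs a given robots.txt would block, offline.
# The Disallow rule and URLs below are hypothetical examples only.
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
User-agent: *
Disallow: /MCategories+Complete-Handguns/filter
"""

parser = RobotFileParser()
# parse() accepts the file's lines directly, so no network fetch is needed
parser.parse(ROBOTS_TXT.splitlines())

urls = [
    "https://example.com/MCategories+Complete-Handguns",
    "https://example.com/MCategories+Complete-Handguns/filter/9mm/large-frame",
]

for url in urls:
    status = "allowed" if parser.can_fetch("*", url) else "BLOCKED"
    print(f"{status}: {url}")
```

If the filtered URLs come back as BLOCKED here, the fix is in robots.txt itself (narrowing or removing the offending Disallow rule, or removing the blocked permutations from the sitemap); if they come back allowed, the Search Console report may be stale and can be revalidated. Note that `urllib.robotparser` does simple prefix matching, so rules relying on `*` wildcards inside paths may not be evaluated the way Google evaluates them.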