Should I implement disallow rules for URLs in the crawler's Page Not Found report?

by phMig
0 replies
  • SEO
Hi guys, I'm confused about whether to block the crawler in my robots.txt file. I run a classifieds website with dynamically created content that is not behind a login. Since last month I've been getting Page Not Found reports from the crawler, so I'm thinking of adding disallow rules to robots.txt (the rules I have in mind are sketched after the list). Here are sample URLs reported as not found:

http://mydomain.com/search/category,...ern,/iPage,12/

http://mydomain.com/jobs/office-cler...edium=facebook

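For reference, this is roughly what I was planning to add. It's just a sketch: the /search/ path and the utm_medium parameter are my guesses based on the truncated URLs above, and I'm relying on Googlebot's support for the * wildcard in Disallow patterns.

User-agent: *
# Block paginated search/category listings
Disallow: /search/
# Block any URL carrying a utm_medium tracking parameter
Disallow: /*utm_medium=
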
Those are valid links in the sense that they redirect the user to the correct page. Having said that, here are some points I need to clear up:

Why does the crawler return a Page Not Found report for these URLs?
Is it healthy to just leave the errors as they are?
If I implement the disallow rules, will I still get ads served on the URLs in question?

TIA.

phMig.
