Webmaster Tools question about URL errors

I had a store section on my site that I deleted around 3-5-15, so I got a bunch of 404s, yadda yadda...

I am still getting 404 errors; that's okay.

But what seems weird to me is that the "date first detected" on these new 404s says 4-26-15, and the "linked from" URL is also gone (404).

So how can Googlebot detect a 404 page linked from a page that doesn't exist?

And should I be concerned that the "date first detected" is a few days ago when the pages have been gone for almost two months?

Thanks!
#search engine optimization #errors #question #tools #url #webmaster
  • Banned
    In that case the problem isn't a missing page.

    The problem is a link pointing to a page that doesn't exist, even if that link/URL is only a typo.

    Run Screaming Frog & look for broken internal links (404s). If it's not an internal link problem, then it's a broken backlink from another domain; catch the 404 URL & do a 301 redirect to a live page.
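    As a minimal sketch of the 301 approach suggested above — assuming an Apache server with mod_alias enabled, and that the deleted section lived under a hypothetical /store/ path — a single rule in .htaccess can catch every request to the removed folder:

    ```apache
    # Hypothetical example: permanently redirect anything under the
    # deleted /store/ section to the site root (or another live page).
    # Uses mod_alias; adjust the pattern and target for your own URLs.
    RedirectMatch 301 ^/store/.*$ /
    ```

    Redirecting to the most relevant live page (rather than blindly to the homepage) is generally preferable, so you may want one rule per old URL if only a handful of pages were removed.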
  • I personally prefer Xenu over Screaming Frog because it is faster, and depending on the size of your site you may not be able to fully analyze it with Screaming Frog.
    • Banned
      I prefer Screaming Frog because IMO it's far easier to drill down to problems & easier to sort data. I haven't had any problems crawling sites.
  • Oh, I should also have said that the page linking to the page that does not exist is also on my site and was also deleted on 3-5-15. So there is no page pointing to the missing page. Both gone.

    That's why I don't understand. How is Googlebot crawling pages that do not exist? And why is it reporting pages that do not exist linked from pages that do not exist? And why is the date first detected a few days ago when the pages have been gone since 3-5?
  • There are no links pointing to the pages that do not exist on my site. I deleted the entire folder and never linked from anywhere else on the site.

    Webmaster Tools shows the URLs that the 404 pages are "linked from," and they are all pages that do not exist.

    ...and today I am still getting more of these showing up, with the first detected date being a few days ago, even though the entire folder was deleted back in March.

    It says "Googlebot couldn't crawl this URL because it points to a non-existent page."

    But it is crawling non-existent pages - it says it found the link by crawling a non-existent page.
    • Banned
      Understand that Google digs, and they dig deep when it comes to sites/hosts. There's almost always a backdoor (hidden links), especially when it comes to a CMS (e.g. WordPress).

      Difficult to suggest anything else without knowing the problem domain/URLs.
  • First of all, update your sitemap, then resubmit your website in Google Webmaster Tools.

    You can also implement 301 redirects for the URLs that are returning 404 Not Found errors, pointing each one to the most relevant live page on your website.
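    For the sitemap step, the key point is that the resubmitted sitemap should list only live pages — every URL from the deleted section must be dropped. A minimal sketch of a valid sitemap file, using a hypothetical example.com domain, looks like this:

    ```xml
    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- Include only pages that still resolve with a 200 status;
           remove every entry for the deleted store section. -->
      <url>
        <loc>https://www.example.com/</loc>
        <lastmod>2015-04-26</lastmod>
      </url>
    </urlset>
    ```

    After uploading the cleaned file, resubmit it under Sitemaps in Webmaster Tools so Google recrawls against the current URL list.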
