Only 1 URL indexed, 424 submitted?

53 replies
I was looking in my webmaster tools the other day and I see that only 1 URL is in Google's index, yet 424 were submitted. I'm slightly confused, as it has been this way for a while. Not sure what to do next?
#search engine optimization #424 #index #submitted #url

  • How old is the site?

    Have you set a noindex tag to the pages by mistake with an SEO plugin?

    Is the content unique?

    Have you fetched your site using the GWT fetch tool?

    Have you blocked something in error with robots.txt or .htaccess? (There's a quick self-check sketch below this list.)

    Have you checked your server logs for crawl errors?
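
    For the robots.txt / noindex points above, a rough self-check is only a few lines of Python. This is a sketch, not a definitive test: it assumes the requests package is installed and uses example.com as a placeholder for your own domain.

```python
# Rough self-check: does robots.txt block Googlebot, and does a page
# carry an accidental noindex (meta tag or X-Robots-Tag header)?
# Assumes the "requests" package; example.com is a placeholder domain.
import requests
from urllib.robotparser import RobotFileParser

SITE = "http://example.com"          # replace with your own domain
PAGE = SITE + "/"                    # any page you expect to be indexed

# 1. robots.txt: is Googlebot allowed to fetch the page at all?
rp = RobotFileParser()
rp.set_url(SITE + "/robots.txt")
rp.read()
print("Googlebot allowed by robots.txt:", rp.can_fetch("Googlebot", PAGE))

# 2. noindex: check the X-Robots-Tag header and the raw HTML.
resp = requests.get(PAGE, timeout=10)
print("HTTP status:", resp.status_code)
print("X-Robots-Tag header:", resp.headers.get("X-Robots-Tag", "none"))
print("'noindex' appears in the HTML:", "noindex" in resp.text.lower())
```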
    • [1] reply
    • The site is 1 year old

      I use all in one seo pack so I did not set a noindex tag

      content is 100% unique

      I have fetched it a few times

      Nothing was blocked on my end

      There are no crawl errors

      I get two sitemap errors though

      1-When we tested a sample of the URLs from your Sitemap, we found that some URLs were not accessible to Googlebot due to an HTTP status error. All accessible URLs will still be submitted.

      2-Some URLs listed in this Sitemap have a high response time. This may indicate a problem with your server or with the content of the page.

      Any idea?

      I have worked on my response time and have drastically improved it and other pagespeed factors
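
      A minimal way to reproduce those two warnings locally is to pull every URL out of the sitemap and check its HTTP status code and response time. Sketch only, assuming the requests package is installed; the domain is a placeholder.

```python
# Check every URL listed in the sitemap: print HTTP status and response
# time, flagging anything that isn't a 200. Assumes "requests" is installed.
import requests
import xml.etree.ElementTree as ET

SITEMAP = "http://example.com/sitemap.xml"   # placeholder domain
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

xml_text = requests.get(SITEMAP, timeout=10).text
urls = [loc.text for loc in ET.fromstring(xml_text).findall(".//sm:loc", NS)]

for url in urls:
    r = requests.get(url, timeout=15)
    flag = "" if r.status_code == 200 else "  <-- HTTP status error"
    print(f"{r.status_code}  {r.elapsed.total_seconds():.2f}s  {url}{flag}")
```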
      • [1] reply
  • Hmm,
    I haven't come across that problem before, but others have; try searching this in quotes...
    "Sitemap, we found that some URLs were not accessible to Googlebot due to an HTTP status error"

    Have you got AdSense on your site? If so, try putting an ad in the sidebar, or if not, some related YT video, and see if Big G picks that up.
  • 1. Check that your site navigation is proper, from the homepage down to your deepest pages (rough crawl sketch below).
    2. Maybe tweet your links; try some 20 links in tweets and see if they get crawled or not.
    3. Also, Google indexes more quickly from social signals than from webmaster tools, so try social signals for indexing.
  • The site is The Physique Formula Diet.

    I have no idea how or why Googlebot would be blocked. I NEVER blocked it myself and I do not have any webmaster. How would I check to see if it's blocked?
    • [1] reply
    • Banned
      Looks like a problem on your end, because I seriously doubt Google would index a robots.txt file if they had a problem with your domain.

      Plus, your Home page was cached yesterday (4/26/2015).
  • Seems totally weird.
    I don't think it is related but is this sitemap ok? http://physiqueformuladiet.com/sitemap.xml I've never seen one like this before.
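
    If that file is a sitemap index (a sitemap that just lists other sitemaps, which several WordPress plugins generate), something like this sketch will show what it points to and whether each sub-sitemap loads. Assumes the requests package is installed.

```python
# Tell a sitemap index apart from a regular sitemap, then check that every
# <loc> it lists actually loads. Assumes the "requests" package.
import requests
import xml.etree.ElementTree as ET

INDEX = "http://physiqueformuladiet.com/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(requests.get(INDEX, timeout=10).text)
kind = "sitemap index" if root.tag.endswith("sitemapindex") else "regular sitemap"
print("File type:", kind)

# List whatever the file points to (sub-sitemaps or pages) and its status.
for loc in root.findall(".//sm:loc", NS):
    status = requests.get(loc.text, timeout=10).status_code
    print(status, loc.text)
```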
    • [1] reply
  • Banned
    Here's an internal page indexed & cached yesterday (4/26/2015); I'm not sure why it doesn't show up when doing a full site search.


    [edit]

    Here's another random internal page cached yesterday (4/26/2015).
    • [1] reply
      So you mean Google acknowledged those specific pages yesterday?

  • Your site has 562 indexed pages in Google.com and .co.uk, so there is no problem as far as I can see.

    I recall having a problem with a sitemap plug-in too; that one is now in the trash. I suggest you build and submit your own using an online XML tool, end of problem.
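
    If you'd rather roll the sitemap yourself instead of trusting a plugin or online tool, the format is simple enough to generate with the Python standard library. A minimal sketch; the URLs below are placeholders.

```python
# Minimal hand-rolled sitemap: writes a valid sitemap.xml from a plain list
# of URLs. Standard library only; the URLs are placeholders.
import xml.etree.ElementTree as ET

urls = [
    "http://example.com/",
    "http://example.com/about/",
    "http://example.com/blog/first-post/",
]

urlset = ET.Element("urlset",
                    xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for url in urls:
    entry = ET.SubElement(urlset, "url")
    ET.SubElement(entry, "loc").text = url

# Writes sitemap.xml with the XML declaration, ready to upload and submit.
ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8",
                             xml_declaration=True)
```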
    • [1] reply
    • Banned
      WHOA, now that's some weird stuff.

      I did a site: search a few minutes ago for the OP's entire site & the only page showing was the sitemap file. Now I see 490 pages indexed. I tested both www & non-www on the site: search earlier, same results (nothing).

      If it was my site I would definitely keep an eye on that site: search for a while.

      OP, fix that XML sitemap. There are also two 404 pages on your site (run Screaming Frog & sort by the error code column).
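
      Without Screaming Frog, a crude stand-in is to grab every internal link off the homepage and print anything that doesn't come back as a 200. Sketch only, assuming the requests package is installed.

```python
# Poor man's 404 sweep: pull the links off the homepage with a quick regex
# and print any internal URL that isn't a 200. Assumes "requests" is installed.
import re
import requests
from urllib.parse import urljoin, urlparse

HOME = "http://physiqueformuladiet.com/"
html = requests.get(HOME, timeout=10).text

# Very rough link extraction; good enough for a quick error-code check.
for href in sorted(set(re.findall(r'href="([^"]+)"', html))):
    link = urljoin(HOME, href)
    if urlparse(link).netloc != urlparse(HOME).netloc:
        continue                      # skip external links
    code = requests.get(link, timeout=10).status_code
    if code != 200:
        print(code, link)
```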
      • [1] reply
  • Here's what webmaster tools tells me.

    1-When we tested a sample of the URLs from your Sitemap, we found that some URLs were not accessible to Googlebot due to an HTTP status error. All accessible URLs will still be submitted.

    2-Some URLs listed in this Sitemap have a high response time. This may indicate a problem with your server or with the content of the page.
  • This thread can be used as an SEO audit manual. Thanks Yukon.
  • Banned
    There's still an issue with the OP's site; he needs to decide if he wants to use www or non-www for indexing pages. Pick one & stick with it.
    • [1] reply
    • Hey Yukon,
      Thanks. Can you explain that? How would I pick one?
      • [1] reply
  • In order for your page to be indexed, there must be unique content on the page. If your page has nothing other than a URL and a little duplicate content, then Google will not bother to crawl your backlinks, irrespective of how many times you submit them to Google.
    • [1] reply
    • Banned
      Stop the madness...

      Even duplicate internal pages still get indexed on Supplemental SERPs. Traffic won't find those specific pages but they're still indexed.
  • You just need to redirect from www to non-www. This guide should help:
    regex - Generic htaccess redirect www to non-www - Stack Overflow
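
    That Stack Overflow answer covers the .htaccess rule itself. Once it's in place, a quick way to confirm that one hostname answers directly and the other redirects to it is a sketch like this (assumes the requests package is installed):

```python
# Check both hostnames: one should answer 200, the other should 301 to it.
# Assumes the "requests" package.
import requests

for url in ("http://physiqueformuladiet.com/",
            "http://www.physiqueformuladiet.com/"):
    r = requests.get(url, allow_redirects=False, timeout=10)
    print(url, "->", r.status_code,
          r.headers.get("Location", "(no redirect)"))
```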
  • Hi,

    Please be careful. Sometimes both WordPress SEO by Yoast and Google XML Sitemaps are enabled. Both of them have a sitemap generator, and usually Google XML Sitemaps works better, so you should disable the sitemap function in WordPress SEO by Yoast.
  • Have you tried any legit indexing service sites like Indexification or Lindexed to potentially index your website?
    • [1] reply
    • Nope, thought those were considered bad?
  • So I uploaded a NEW sitemap plugin and I STILL get the same error messages. Someone told me "just get rid of the error pages" and that's great but I have NO idea how the error pages were created and I can't find them on my server or in my site to delete.

    For example
    http://physiqueformuladiet.com/sitem...st-2013-11.xml

    That opens to a 404 page; I don't know how to delete it.
  • Hey All,
    Still having the same issues. Any advice?
    Thanks
    • [1] reply
  • Just delete the sitemap and get rid of the sitemap plugin.

    For a site of only 400 pages, you do not need a sitemap. If you do need one, that means your navigation is lousy.
    • [2] replies
    • Two things I always did: an XML sitemap, because people said to for Google, and an HTML sitemap for visitors. I've stopped using both. Google will eventually find your pages as long as they are not orphaned.
      • [1] reply
    • Delete the sitemap from webmaster tools?
      • [1] reply
  • Are you checking from webmaster tools?
    • [1] reply
    • Yes I am checking from webmaster tools. Why?
  • You can generate and upload the XML sitemap, as well as the HTML one, again. Also, if you have made any changes on your website, then submit your website for indexing again in Google Webmaster Tools using Fetch as Google.
  • Any idea how I'd get rid of the 404 error?
  • You have not submitted your website in the right way, or you have not added your website's pages to your sitemap.xml file.
  • I just ran a google safe browsing diagnostic on my site. It gives a category titled "What happened when Google visited this site?"

    The result
    Google has not visited this site within the past 90 days.

    I'd imagine that this means they aren't crawling my site and recognizing all the new articles I'm putting up.
