Only 1 URL indexed, 424 submitted?

53 replies
Was looking in my Webmaster Tools the other day and I saw that only 1 URL is in Google's index, yet 424 were submitted. I'm slightly confused, as it has been this way for a while. Not sure what to do next?
  • Profile picture of the author Tim3
    Originally Posted by jsmith2482 View Post

    Was looking in my Webmaster Tools the other day and I saw that only 1 URL is in Google's index, yet 424 were submitted. I'm slightly confused, as it has been this way for a while. Not sure what to do next?

    How old is the site?

    Have you set a noindex tag to the pages by mistake with an SEO plugin?

    Is the content unique?

    Have you fetched your site using GWT fetch tool?

    Have you blocked something in error with robots.txt or .htaccess?

    Have you checked your server logs for crawl errors?
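    A quick way to run the robots.txt check from the list above is to test your URLs against the live rules. A rough Python sketch using only the standard library; the rules and example.com URLs below are placeholders, paste in your site's real robots.txt text:

```python
# Rough sketch: test whether a URL is blocked for Googlebot by a set of
# robots.txt rules. The rules and URLs here are placeholders.
from urllib.robotparser import RobotFileParser

def is_blocked(robots_txt, url, agent="Googlebot"):
    """True if `agent` is disallowed from fetching `url` under these rules."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return not rp.can_fetch(agent, url)

rules = """User-agent: *
Disallow: /private/
"""
print(is_blocked(rules, "http://example.com/private/page"))  # True (blocked)
print(is_blocked(rules, "http://example.com/blog/post"))     # False (crawlable)
```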
    • Profile picture of the author jsmith2482
      The site is 1 year old

      I use all in one seo pack so I did not set a noindex tag

      content is 100% unique

      I have fetched it a few times

      Nothing was blocked on my end

      There are no crawl errors

      I get two sitemap errors though

      1. When we tested a sample of the URLs from your Sitemap, we found that some URLs were not accessible to Googlebot due to an HTTP status error. All accessible URLs will still be submitted.

      2. Some URLs listed in this Sitemap have a high response time. This may indicate a problem with your server or with the content of the page.

      Any idea?

      I have worked on my response time and have drastically improved it and other pagespeed factors
      • Profile picture of the author SEO Power
        Originally Posted by jsmith2482 View Post

        The site is 1 year old

        I use all in one seo pack so I did not set a noindex tag

        content is 100% unique

        I have fetched it a few times

        Nothing was blocked on my end

        There are no crawl errors

        I get two sitemap errors though

        1. When we tested a sample of the URLs from your Sitemap, we found that some URLs were not accessible to Googlebot due to an HTTP status error. All accessible URLs will still be submitted.

        2. Some URLs listed in this Sitemap have a high response time. This may indicate a problem with your server or with the content of the page.

        Any idea?

        I have worked on my response time and have drastically improved it and other pagespeed factors
        If you haven't blocked any URLs via your robots.txt file and your content is unique, all you need do is wait. It takes time to index 400+ urls and Google doesn't guarantee that all your urls will be indexed. Make sure you've implemented proper internal linking for faster crawling.
        • Profile picture of the author jsmith2482
          Originally Posted by SEO Power View Post

          If you haven't blocked any URLs via your robots.txt file and your content is unique, all you need do is wait. It takes time to index 400+ urls and Google doesn't guarantee that all your urls will be indexed. Make sure you've implemented proper internal linking for faster crawling.
          I have proper internal linking and have been waiting for about 4 months to get more than 1 url indexed
          • Profile picture of the author Tim3
            Originally Posted by jsmith2482 View Post

            I have proper internal linking and have been waiting for about 4 months to get more than 1 url indexed

            Another thought just occurred,
            It looks like you know what you are doing, but are your canonicals correct?
          • Profile picture of the author yukon
            Banned
            Originally Posted by Tim3 View Post

            Have you set a noindex tag to the pages by mistake with an SEO plugin?
            Originally Posted by jsmith2482 View Post

            I use all in one seo pack so I did not set a noindex tag
            You'll need to do better than that...

            View the problem live page HTML source code in your browser & verify there's not a noindex tag on the page/s.
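            To make that check concrete, here's a small Python sketch that scans a saved page source (and, optionally, the response headers) for a noindex directive. The sample HTML is a placeholder:

```python
# Rough sketch: look for a noindex directive in page HTML or in the
# X-Robots-Tag response header. Pure string checks, standard library only.
import re

def has_noindex(html, headers=None):
    # Matches <meta name="robots" content="..."> with name before content;
    # a tag written with the attributes reversed would need a second pattern.
    meta = re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']*)["\']',
        html, re.IGNORECASE)
    if meta and "noindex" in meta.group(1).lower():
        return True
    # Servers can also send noindex as an HTTP header, invisible in the HTML.
    if headers and "noindex" in headers.get("X-Robots-Tag", "").lower():
        return True
    return False

page = '<html><head><meta name="robots" content="noindex,follow"></head></html>'
print(has_noindex(page))                                          # True
print(has_noindex("<html></html>", {"X-Robots-Tag": "noindex"}))  # True
print(has_noindex("<html></html>"))                               # False
```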

            Originally Posted by jsmith2482 View Post

            I have proper internal linking and have been waiting for about 4 months to get more than 1 url indexed
            You're wasting time because there's no possible way it takes 4 months to index a webpage on Google SERPs unless you're blocking Googlebot from your page.
  • Profile picture of the author Tim3
    Hmm,
    I haven't come across that problem before, but others have; try searching this in quotes...
    "Sitemap, we found that some URLs were not accessible to Googlebot due to an HTTP status error"

    Have you got AdSense on your site? If so, try putting an ad in the sidebar, or if not, some related YT video, and see if Big G picks that up.
  • Profile picture of the author SEWARRIOR
    1. Check that your site navigation is proper, from the homepage down to the deepest pages.
    2. Maybe tweet your links; try some 20 links in tweets and see if they get crawled or not.
    3. Google also indexes more quickly from social signals than from Webmaster Tools, so try social signals for indexing.
  • Profile picture of the author jsmith2482
    The site is The Physique Formula Diet.

    I have no idea how or why Googlebot would be blocked. I NEVER blocked it myself and I do not have any webmaster. How would I check to see if it's blocked?
    • Profile picture of the author yukon
      Banned
      Originally Posted by jsmith2482 View Post

      The site is The Physique Formula Diet.

      I have no idea how or why Googlebot would be blocked. I NEVER blocked it myself and I do not have any webmaster. How would I check to see if it's blocked?
      Looks like a problem on your end, because I seriously doubt Google would index a robots.txt file if they had a problem with your domain.

      Plus, your Home page was cached yesterday (4/26/2015).
  • Profile picture of the author AndresNWD
    Seems totally weird.
    I don't think it is related but is this sitemap ok? http://physiqueformuladiet.com/sitemap.xml I've never seen one like this before.
  • Profile picture of the author yukon
    Banned
    Here's an internal page indexed & cached yesterday (4/26/2015), I'm not sure why it doesn't show up while doing a full site search.


    [edit]

    Here's another random internal page cached yesterday (4/26/2015).
  • Profile picture of the author Tim3
    Your site has 562 indexed pages in Google.com and .co.uk, so there is no problem as far as I can see.

    I recall having a problem with a sitemap plug-in too; it is now in the trash. I suggest you build and submit your own using an online XML tool, end of problem.
    • Profile picture of the author yukon
      Banned
      Originally Posted by Tim3 View Post

      Your site has 562 indexed pages in Google.com and .co.uk. so there is no problem as far as I can see.

      I recall having a problem with a sitemap plug-in too, that is now in the trash, suggest you build and submit my own using an online XML tool, end of problem.
      WHOA, now that's some weird stuff.

      I did a site: search a few minutes ago for the OP's entire site & the only page showing was the sitemap file. Now I see 490 pages indexed. I tested both www & non-www on the site: search earlier, same results (nothing).

      If it was my site I would definitely keep an eye on that site: search for a while.

      OP, fix that xml sitemap. There are also two 404 pages on your site (run Screaming Frog & sort by the error code column).
      • Profile picture of the author Tim3
        Originally Posted by yukon View Post

        WHOA, now that's some weird stuff.

        I did a site: search a few minutes ago for OPs entire site & the only page showing was the sitemap file. Now I see 490 pages indexed. I tested both www & non-www on the site: search earlier, same results (nothing).
        Strange indeed,
        I wonder if, when you checked first, there was some sort of algo sorting/updating going on in Big G's box of tricks, but when I checked, the updating was resolved and it was displaying the latest crawl results.
  • Profile picture of the author jsmith2482
    Here's what webmaster tools tells me.

    1. When we tested a sample of the URLs from your Sitemap, we found that some URLs were not accessible to Googlebot due to an HTTP status error. All accessible URLs will still be submitted.

    2. Some URLs listed in this Sitemap have a high response time. This may indicate a problem with your server or with the content of the page.
  • Profile picture of the author AndresNWD
    This thread can be used as an SEO audit manual. Thanks Yukon.
  • Profile picture of the author yukon
    Banned
    There's still an issue with the OP's site; he needs to decide if he wants to use www or non-www for indexing pages. Pick one & stick with it.
    • Profile picture of the author jsmith2482
      Originally Posted by yukon View Post

      There's still an issue with OPs site, he needs to decide If he wants to use www or non-www for indexing pages. Pick one & stick with it.
      Hey Yukon,
      Thanks. Can you explain that? How would I pick one?
      • Profile picture of the author yukon
        Banned
        Originally Posted by jsmith2482 View Post

        Hey Yukon,
        Thanks. Can you explain that? How would I pick one?
        Just pick one & stick with it, especially for internal links. It doesn't matter which one you pick, though you do have 99% of your pages currently indexed without www, so personally I would stick to non-www for that domain/site.

        Neither one matters (www vs non-www) as far as ranking pages, but you need to stick to only one of them, because as you can see it's a bit confusing for Google when listing pages in the SERPs.
        • Profile picture of the author jsmith2482
          Originally Posted by yukon View Post

          Just pick one & stick with it, especially for internal links. It doesn't matter which one you pick though you do have 99% of your pages currently indexed without www so personally I would stick to non-www for that domain/site.

          Neither one matters (www vs non-www) as far as ranking pages but you need to stick to only one of them because as you can see it's a bit confusing for Google when listing pages in the SERPs.
          Thanks. How do I "pick it" in google's eyes? How do I pick it in webmaster tools?

          Also, how do I delete my sitemap in webmaster tools and what plugin do you suggest to build my new sitemap?
        • Profile picture of the author jsmith2482
          Originally Posted by yukon View Post

          Just pick one & stick with it, especially for internal links. It doesn't matter which one you pick though you do have 99% of your pages currently indexed without www so personally I would stick to non-www for that domain/site.

          Neither one matters (www vs non-www) as far as ranking pages but you need to stick to only one of them because as you can see it's a bit confusing for Google when listing pages in the SERPs.
          Thanks, I figured out how to set the preference, but Webmaster Tools always moves the button back to "don't set preferred domain". It gives me this message:
          Part of the process of setting a preferred domain is to verify that you own http://physiqueformuladiet.com/. Please verify http://physiqueformuladiet.com/.

          Yet it doesn't tell me how to verify that I own the url
          • Profile picture of the author yukon
            Banned
            Originally Posted by jsmith2482 View Post

            Thanks, I figured out how to set the preference but Webmaster tools always moves the button back to "don't set preferred domain". It gives me this message
            Part of the process of setting a preferred domain is to verify that you own The Physique Formula Diet. Please verify The Physique Formula Diet.

            Yet it doesn't tell me how to verify that I own the url
            Did you copy/paste the Webmaster Tools verification code in your website HTML <head>?
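            For reference, the HTML-tag verification method in Webmaster Tools is a single meta tag in the page <head>; the token below is a placeholder for the value GWT generates for your account:

```html
<head>
  <!-- Placeholder token: copy the real tag from the HTML-tag
       verification option in Webmaster Tools -->
  <meta name="google-site-verification" content="YOUR-TOKEN-HERE" />
</head>
```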
            • Profile picture of the author jsmith2482
              Originally Posted by yukon View Post

              Did you copy/paste the Webmaster Tools verification code in your website HTML <head>?
              Okay, I figured out how to use the non-www domain. But now I get 24 sitemap errors. I'm using the Google XML Sitemaps plugin, which is the best reviewed in the WordPress plugin section.

              Not sure what to do.

              Here's the error
              We encountered an error while trying to access your Sitemap. Please ensure your Sitemap follows our guidelines and can be accessed at the location you provided and then resubmit. General HTTP error 404 not found
              • Profile picture of the author yukon
                Banned
                Originally Posted by yukon View Post

                That's a typical Wordpress Arne Brachhold plugin xml sitemap.

                [edit]

                All the pages in that xml sitemap are dead links. That xml sitemap is useless.

                Originally Posted by jsmith2482 View Post

                Okay I figured out how to use the non-www. domain. But now I get 24 sitemap errors. I'm using the google XML sitemap which is the best reviewed in the wordpress plugin section.

                Not sure what to do.

                Here's the error
                We encountered an error while trying to access your Sitemap. Please ensure your Sitemap follows our guidelines and can be accessed at the location you provided and then resubmit. General HTTP error 404 not found
                That's why I said to fix all the dead links in the xml sitemap.

                Google eventually found them...
  • Profile picture of the author johnyjee
    In order to index your page, there must be unique content on it. If your page has nothing other than a URL and some duplicate content, then Google will not bother to crawl your backlinks, irrespective of how many times you submit them to Google.
    • Profile picture of the author yukon
      Banned
      Originally Posted by johnyjee View Post

      In order to index your page there must be unique content on the page. If your page has nothing other than URL and few duplicate content then Google will not bother to crawl your backlinks irrespective of how many times you submit your backlinks to Google.
      Stop the madness...

      Even duplicate internal pages still get indexed on Supplemental SERPs. Traffic won't find those specific pages but they're still indexed.
  • Profile picture of the author AndresNWD
    You just need to redirect from www to non-www. This guide should help:
    regex - Generic htaccess redirect www to non-www - Stack Overflow
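    The generic rule from that Stack Overflow answer looks roughly like this; an .htaccess sketch assuming Apache with mod_rewrite enabled, adjust if your host differs:

```apache
# 301-redirect any www host to its bare (non-www) equivalent.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC]
RewriteRule ^ http://%1%{REQUEST_URI} [R=301,L]
```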
  • Profile picture of the author Hostnameclub
    Hi,

    Please check carefully. Sometimes both WordPress SEO by Yoast and Google XML Sitemaps are enabled at the same time. Both of them have a sitemap generator, and usually Google XML Sitemaps works better, so you should disable the sitemap function in WordPress SEO by Yoast.
  • Profile picture of the author Jeff Willy
    Have you tried any legit indexing service sites like Indexification or Lindexed to potentially index your website?
  • Profile picture of the author jsmith2482
    So I uploaded a NEW sitemap plugin and I STILL get the same error messages. Someone told me to "just get rid of the error pages", and that's great, but I have NO idea how the error pages were created, and I can't find them on my server or in my site to delete them.

    For example
    http://physiqueformuladiet.com/sitem...st-2013-11.xml

    That opens to a 404 page; I don't know how to delete it.
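    One way to hunt down dead sitemap entries like that is to list every <loc> URL from the sitemap XML and then status-check each one. A rough Python sketch; the sample sitemap below is a placeholder:

```python
# Rough sketch: extract the <loc> URLs from a sitemap so each one can be
# status-checked; entries returning 404 are what GWT is flagging.
import xml.etree.ElementTree as ET

SM = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Return every <loc> URL in a urlset (or sitemapindex) document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SM + "loc")]

sample = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://example.com/</loc></url>
  <url><loc>http://example.com/post-1</loc></url>
</urlset>"""
print(sitemap_urls(sample))  # ['http://example.com/', 'http://example.com/post-1']
# To find the dead entries, request each URL (urllib.request works) and
# note every one that comes back 404 before rebuilding the sitemap.
```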
  • Profile picture of the author MikeFriedman
    Just delete the sitemap and get rid of the sitemap plugin.

    For a site of only 400 pages, you do not need a sitemap. If you do need one, that means your navigation is lousy.
  • Are you checking from the webmaster tool?
  • Profile picture of the author Hudson White
    You can generate and upload the sitemap XML (and the HTML one) again. Also, if you have made any changes on your website, submit it for indexing again in Google Webmaster Tools using Fetch as Google.
  • Profile picture of the author deepakrajput
    Either you have not submitted your website the right way, or you have not added your website's pages to your sitemap.xml file.
  • Profile picture of the author jsmith2482
    I just ran a google safe browsing diagnostic on my site. It gives a category titled "What happened when Google visited this site?"

    The result
    Google has not visited this site within the past 90 days.

    I'd imagine that this means they aren't crawling my site and recognizing all the new articles I'm putting up.
