How to get rid of 404, category and tag pages from indexing

by Maddy5 11 replies
Hello Warriors

I have a problem: I am working on an e-commerce site that has 30 pages in total, but Google is showing 157 pages currently indexed for that site.

Lots of tag pages, category pages and 404 pages are indexed. I tried to remove those pages through Search Console's Remove URLs section. They were gone for a few days, but now they are indexed again.

So what can I do in this situation? Please help.
#search engine optimization #404 #category #indexing #page #pages #rid #tag
  • Profile picture of the author expmrb
    If your site is new, it may take some time for Google to recognize it. Give it a week or two.
    Signature
    SEO Motionz Forum - A Digital Marketing Forum
    Forum Management & Promotion, SEO Tips, Money Making tips etc.
    • Profile picture of the author Maddy5
      It's not a new site, and I did that about 3 weeks ago.
      • Profile picture of the author RyanJoseph
        Just try editing the website's robots.txt file. In it you can list the pages which are not to be indexed by Google, and it can be done separately for individual pages with a Disallow rule.

        "If you are not familiar with editing Robots.txt file then get assistance from your programmer"
        • Profile picture of the author Maddy5
          I am familiar with it and have used it in the past. My only concern is that I am not sure this is the right way to do it.
  • Profile picture of the author johnny07
    Remove the tags and the categories from the backend. Check your robots.txt file and fix it.
  • Profile picture of the author ruther
    With the robots.txt file, you can stop those particular pages from being crawled and indexed.
  • Profile picture of the author SamuelKumar
    Create a robots.txt file and use it to block the particular page or category.
  • Profile picture of the author MikeFriedman
    Add a noindex tag to the pages you do not want indexed. Google will not remove them instantly, but they will eventually drop out of the index.
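
    For reference, the noindex directive goes in the <head> of each page you want dropped, something like:

        <meta name="robots" content="noindex">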
    Signature
    SEO Myths
    SEO, PPC, Geo-Fencing, and Social Media Marketing Services - Get a Free Quote
  • Profile picture of the author JamieReynolds
    Besides adding a noindex tag to the pages you do not want indexed, you can do a 301 redirect of the error pages.

    Install a 301 redirect plugin, then follow that step by removing the errors in Search Console.

    Then wait a few days for the updates to kick in.
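
    If you would rather not rely on a plugin, a 301 can also be set up directly in .htaccess on Apache hosts. A sketch with hypothetical URLs:

        Redirect 301 /old-deleted-page/ https://www.example.com/shop/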
    Signature

    Tasty and captivating book descriptions -
    https://goo.gl/53LbXr

  • Profile picture of the author paulgl
    The robots.txt is the correct answer. You can't keep playing whack-a-mole.

    Those pages are generated automatically, and if Google is crawling them, it will index them.

    Simple code, but you may need to look for an example.

    That would also help with the 404s, but have a custom 404 page as well.
    You can't eliminate 404 errors no matter how hard you try.
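
    On Apache hosting, pointing to a custom 404 page is usually a one-line .htaccess rule (assuming the error page lives at /404.html):

        ErrorDocument 404 /404.html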

    Paul
    Signature

    If you were disappointed in your results today, lower your standards tomorrow.

  • Profile picture of the author Sclark
    You should simply make sure that there are no links on your website pointing to those 404 pages, and restrict the valid pages from indexing if you want them to be removed from Google's index.

    I'd recommend WebsiteAuditor to handle that quickly: it can collect all indexed pages along with all internal links to them, detect any broken links, or separately collect all the pages with specific words in their URLs like 'tag' or 'category%name'. Also, there's a handy robots.txt generator to add a disallow rule for a bunch of pages at once.

    There's no way to force Google to remove the pages from its index right away, but once it re-visits them, they should get de-indexed.