Indexing 30k Links

19 replies
Guys, I am looking for an automated solution that can index anywhere from 30-50k links. I know there are several tools that will index small amounts of links, but I need something that will index large amounts for me.


Is there something like this out there?


I am not interested in indexing tools that will do 250 to 500 at a time as that will take forever. Any advice or suggestions would be greatly appreciated.
#search engine optimization #30k #indexing #links
  • It seems that you want to get all the backlinks indexed at once, right?
    That's not good practice. A massive number of backlinks appearing at once can look suspicious to Google. That's why the services you find only create a small number at a time.
    • [1] reply

    • I understand your concern, but trust me when I say I know what I am doing.

      I just need to locate a tool that can do it at a large scale, automatically.
  • I have an idea on how I can do it, but was looking for other ideas as well.


    My idea is to create a page that has 30k links on it and then work to get that indexed. A page of this nature will definitely be hard to get indexed, but it will be a lot easier than some of the other methods I have seen out there.
    • [2] replies
    • I think this is probably your best bet. Or break them up into smaller blocks of one to five thousand links and ping them separately, or otherwise get them indexed individually.

      Good luck.
    • Sorry man, but that's just a terrible idea on so many levels. Not to mention...it'll be far from effective.

      I won't bother citing the reasons that it's a bad idea.

      FYI - also, getting a page indexed, G-bot following said links on the page, and caching them ... well, they're not mutually inclusive.

      Google caching a page with 5 outgoing links in no way makes those 5 URLs a foregone conclusion for caching/indexation, let alone thousands on a single page being followed and cached. Just sayin'.



  • How much time and what is your definition of indexed? What kind of links?
  • My definition of indexed is just showing up when you check yahoo backlinks. From a time standpoint the faster the better for me.
    • [1] reply

    • You want a service? Or do you want to buy the software and set up the sites needed to get it done, as I have?

      3 Links per post x 12 Posts x 10 sites in a cluster x 15 clusters.... Per Day

      Actual math is 3 x 12 x 15 per day = 540

      30k / 540 ≈ 56 days

      Depending on the type of links - we are getting near 65-70% of the pages that the links are on cached by google.

      Yahoo Site Explorer doesn't really do a whole lot to rank you in Google.
      • [1] reply
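
The throughput arithmetic in the post above can be sanity-checked in a few lines. Note that the poster's own "actual math" drops the "10 sites in a cluster" factor; this sketch reproduces his stated 540/day figure as given.

```python
# Back-of-the-envelope check of the posting scheme described above.
links_per_post = 3
posts_per_day = 12
clusters = 15

links_per_day = links_per_post * posts_per_day * clusters  # 540, as stated
total_links = 30_000
days_needed = total_links / links_per_day  # about 56 days

print(links_per_day, round(days_needed))
```

Had the ×10 sites-per-cluster factor been included, the rate would be 5,400/day and the job would take under a week, so the discrepancy matters.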
  • I need thousands of links indexed on a daily basis as I have A LOT of sites, so 540 a day isn't going to work. I am more looking for a tool, I guess.
    • [1] reply
    • It is a tool and a service ....

      It will work - but obviously not at the pace you desire. I'm not sure there is currently another automated indexation method that could BOOST/Wash/Rinse/Index more than 1,000 a day.

      I could scale my system to post 1,000 per day in about 2 weeks' time. It would require some investment and time, though. The issue is - you need actively crawled properties, many of them ... and you need content to mix in with the links that Google cares to cache ... and you need to be disseminating it automatically from a range of IPs. Additionally, it needs to be done in a fashion as to not piss off high-level IT people and/or Google.

      It's a delicate balance - if longevity is desired.

      As a starting point ...

      25 IPs - 25 subdomains for the software ... then a network of sites exceeding 450 ... maybe more. [ I have 3x that available ]

      I could show you how to scale this up and do it yourself .... it's work to set up but a thing of beauty when it's going ... I'd expect to negotiate a small fee and a sworn commitment to secrecy :-)

      First things first - I'd run the list through the ScrapeBox index checker 1,000 at a time and remove the already-indexed and cached pages/URLs. You need proxies - good ones.

      Whatever's left over from there - you launder.

      This could be interestin'
      • [1] reply
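
The pre-filtering step described above (batch the list through an index checker, drop what's already indexed) can be sketched roughly as follows. `check_indexed` is a hypothetical stand-in for whatever checking tool is actually used; only the batching and filtering logic is shown.

```python
# Sketch, assuming a checker callable that returns {url: bool} per batch.
def chunks(items, size=1000):
    """Yield successive fixed-size batches from a list (1,000 = the
    batch size mentioned for the ScrapeBox index checker above)."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

def filter_unindexed(urls, check_indexed):
    """Return only the URLs the checker reports as not yet indexed."""
    unindexed = []
    for batch in chunks(urls):
        status = check_indexed(batch)  # hypothetical: {url: True/False}
        unindexed.extend(u for u in batch if not status.get(u, False))
    return unindexed
```

Whatever survives this filter is the actual workload, which can shrink the 30k list considerably before any indexing effort starts.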
  • For me, I'd rather not ping search engines; I prefer to let them discover the links themselves.

    Today they discover 2, tomorrow another 2 or 4 ... I'm just worried about getting penalized or sandboxed by them.
  • The problem is that search engines will never discover them all. It just doesn't happen on its own, and that's the reason for getting them indexed.
    • [1] reply
    • From the mouth of Google themselves

      "Offer a site map to your users with links that point to the important parts of your site. If the site map has an extremely large number of links, you may want to break the site map into multiple pages."

      If you try to slap up a page with 30k links, or even 5k, I fear Google may not index anything at all.
      • [1] reply
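
The splitting approach in the quoted guideline can be sketched as follows: emit the link list as several small sitemap files rather than one huge page. The 1,000-URL cap per file here is an illustrative choice, not a figure from Google's guidelines.

```python
# Sketch of splitting a large URL list into multiple sitemap files.
from xml.etree.ElementTree import Element, SubElement, tostring

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemaps(urls, per_file=1000):
    """Return one <urlset> XML string per batch of `per_file` URLs."""
    files = []
    for i in range(0, len(urls), per_file):
        urlset = Element("urlset", {"xmlns": SITEMAP_NS})
        for u in urls[i:i + per_file]:
            SubElement(SubElement(urlset, "url"), "loc").text = u
        files.append(tostring(urlset, encoding="unicode"))
    return files
```

Each returned string would be written out as its own sitemap file and listed in a sitemap index, instead of exposing all 30k links on a single page.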
  • Let's just put it this way: I own a lot of websites and have a lot of offline clients. With each website I am pretty aggressive, as I get paid on performance, so the faster I get results the faster I get paid.

    I have never had a website banned as I am not a big believer in that type of stuff. I would need a min of 5k links a day indexed.
    • [1] reply
    • 5k a day, every day? As in, you'll be adding a NEW batch of 5k links into the system daily? Or do you have this one project that will need to be accomplished in the next X # of days?

      My methods may not pencil out for a one-shot-wonder type project, but if it's something you'd need to do from now on ... perhaps.

      Then one needs to ponder - is it worth the time, effort, and expense to create all this to get a boatload of PR n/a or low-PR, low-quality links from NON-diverse domains indexed by means other than natural ...

      Putting a dabble of caviar on a steaming pile of dung doesn't make it edible cuisine - if you know what I mean.
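
The gap between the requirement and the offered capacity can be made concrete with the numbers already in the thread: 5,000 links/day against the 540/day one setup delivers.

```python
# Rough scaling estimate, reusing the 540 links/day figure quoted earlier.
import math

base_rate = 540    # links/day from one 15-cluster setup (earlier post)
target = 5000      # links/day the OP says he needs

setups_needed = math.ceil(target / base_rate)
print(setups_needed)  # 10 setups, i.e. roughly a 10x network
```

In other words, meeting the stated requirement would mean roughly ten times the network of IPs, subdomains, and sites already described, which is the "investment and time" the seller keeps hedging about.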
