Problems getting links with GSA for tier1

6 replies • SEO
Hello,

I'm trying to build a tier 1 with GSA, and I've set it up very restrictively.

I only want links on articles, web 2.0s, and wikis, and only dofollow, contextual links.
I've set the keywords like this:

In the Options tab, I've set this. I've chosen only the English search engines.

In Filters:

I only want links on articles and article wikis. I filtered by English countries and checked the option to avoid all non-English sites.
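For illustration only: GSA's country filter actually works off the server's IP, but a crude stand-in for "avoid non-English sites" can be sketched by whitelisting TLDs (the TLD list below is a hypothetical example, not what GSA uses):

```python
from urllib.parse import urlparse

# Illustrative whitelist -- not GSA's actual country/language logic.
ENGLISH_TLDS = (".com", ".net", ".org", ".co.uk", ".us", ".ca", ".com.au")

def keep_english_targets(urls, tlds=ENGLISH_TLDS):
    """Keep only URLs whose host ends in a TLD commonly used by
    English-language sites. A rough proxy for GSA's IP-based filter."""
    kept = []
    for url in urls:
        host = urlparse(url).netloc.lower()
        if host.endswith(tlds):  # str.endswith accepts a tuple of suffixes
            kept.append(url)
    return kept
```

Note how aggressive even this crude filter is: anything on a non-whitelisted TLD is dropped outright, which is one way a "restrictive" setup shrinks the target pool.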


Overnight, GSA made only 3 submissions, and I got this message:


"no targets to post to (maybe blocked by search engines no site list enabled no scheduled posting)"




It seems I'm being too restrictive, but there must be thousands of sites to submit to for articles, web 2.0s, wikis, etc. with this configuration.


Could anybody help me?

Thank you very much.
  • jinx1221
    Originally Posted by poerbucker:

    there must be thousands of sites to submit to for articles, web 2.0s, wikis, etc.
    You would think so, with all the automation software boasting on their homepages about how they submit to thousands of sites. Not so much. In fact, of the sites most software lists as submission targets, only about 20% actually work. For most of them, the list is so old that the sites either no longer exist, don't accept new signups, have gone paid, don't post contextual links, etc. I'm not putting the software down (yes I am), it's just a fact. When a submission software website says words like 'thousands', it's really talking about 'possible' sites; to get there you would have to include your own lists of sites on platforms like wikis, WPMU, and others. That's GSA's strength. I personally wouldn't use these for tier 1, though. Just some thoughts.
    Signature

    The Ultimate Private Network Management,
    Visualization and Automation Tool




  • MikeFriedman
    This may sound stupid, but are you using proxies? If not, it could be a problem of Google blocking your IP and GSA not being able to scrape targets.

    The same thing could be happening if you are using bad proxies.
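Mike's proxy check can be automated. A minimal sketch (Python; the function names are mine, not part of GSA) that splits a proxy list into working and dead, with the liveness test passed in so it can be a real HTTP fetch or a stub:

```python
import urllib.request

def partition_proxies(proxies, is_alive):
    """Split "host:port" proxy strings into (working, dead) lists
    using whatever liveness check is passed in."""
    working, dead = [], []
    for proxy in proxies:
        (working if is_alive(proxy) else dead).append(proxy)
    return working, dead

def http_check(proxy, url="https://www.google.com", timeout=10):
    """One possible real check: try to fetch a page through the proxy."""
    handler = urllib.request.ProxyHandler({"http": proxy, "https": proxy})
    opener = urllib.request.build_opener(handler)
    try:
        return opener.open(url, timeout=timeout).status == 200
    except Exception:
        return False
```

Usage would be something like `working, dead = partition_proxies(my_proxies, http_check)`; if `working` comes back empty, the scenario Mike describes (Google blocking your IPs) is the likely culprit.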
    • Laubster
      Originally Posted by MikeFriedman:

      This may sound stupid, but are you using proxies? If not, it could be a problem of Google blocking your IP and GSA not being able to scrape targets.

      The same thing could be happening if you are using bad proxies.
      First, your quality standards are pretty high, so naturally you are going to get fewer submissions. Second, GSA almost NEVER identifies web 2.0s, so you're basically just building articles.

      But yes, your problem is that GSA is not finding targets to post to in Google, meaning you either have proxy problems or need a much larger keyword list so GSA can identify targets. Also, by skipping sites from certain countries you are limiting your options: GSA decides the country from the server's IP, and many English-based domains are going to have stronger spam filters, meaning GSA can't post to them.
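On Laubster's "much larger keyword list" point: scrapers multiply targets by crossing platform footprints with niche keywords. A sketch of that arithmetic (the footprints shown are illustrative examples, not GSA's internal ones):

```python
from itertools import product

def build_queries(footprints, keywords):
    """Cross every footprint with every keyword; the query count is
    len(footprints) * len(keywords), which is why a bigger keyword
    list directly means more scraping targets."""
    return [f'{fp} "{kw}"' for fp, kw in product(footprints, keywords)]

footprints = ['"powered by drupal"', 'inurl:wiki']   # hypothetical examples
keywords = ["gardening", "dog training", "seo"]
queries = build_queries(footprints, keywords)        # 2 * 3 = 6 queries
```

With only a handful of keywords the query pool exhausts quickly, which matches the "no targets to post to" symptom.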
  • SEO Eddie
    It's probably best to try the GSA forum for detailed advice about this.
  • npoint
    The GSA forum is the best place for you, really. There are so many things that it's hard to mention everything here (((:

    The first thing: you should uncheck the search engines, because scraping through them will slow you down a lot. Set your 1st tier to be re-verified every 24 hours, then build a 2nd tier to it, and use a good indexing service (Send to indexing service -> Other) with drip feed. Indexing is very important; unindexed backlinks are useless and a complete waste of your time.
    Signature
    Check your SEO agency work - free of charge. VPS (root)/ Dedicated servers. Traffic bot (free try
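npoint's drip-feed advice, sketched: instead of pushing all verified links to the indexer at once, split them into daily batches (the batch size is your choice; nothing here is GSA's own code):

```python
def drip_feed(links, per_day):
    """Split verified URLs into daily batches so they reach the
    indexing service gradually rather than in one burst."""
    if per_day < 1:
        raise ValueError("per_day must be at least 1")
    return [links[i:i + per_day] for i in range(0, len(links), per_day)]
```

For example, 7 links at 3 per day yields batches of 3, 3, and 1, spread over three days.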
  • npoint
    For scraping target sites, purchase GSCRAPER; using GSA as a scraper is a waste of its potential. GSA SER is the best at posting, not scraping, and GSCRAPER does it a lot faster and more effectively, so unchecking the search engines should speed up your submission process a lot. The other thing I can advise to improve your speed (only if you need the highest speed) is to check only 3 platforms; limit your platforms to:

    - BuddyPress
    - XpressEngine
    - Drupal

    I am processing huge lists on a dedicated server and see that those 3 are the most successful, so on the rest my GSA SER only wastes time. Once I check only those 3 platforms, my LPM increases to over 100, sometimes 140-150, depending on how many private proxies I am using and what condition my list is in. If you need more diversity, of course, the whole process will be a lot slower. If you want to increase submissions on other platforms, you could also decrease the timeout, but the risk is that you will limit the number of target sites you are able to post to, because some of them are not on very fast hosting, so their response is not very fast. It all depends on what you need: if you require more diversity, you have to accept a lower LPM, which is normal, and even 100 private proxies will not give you more than 40-50 LPM (because of the response time of target sites). But if you need speed for your 2nd tier, you can limit yourself to those 3 platforms and get over 100 LPM.
    Signature
    Check your SEO agency work - free of charge. VPS (root)/ Dedicated servers. Traffic bot (free try
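The LPM figures quoted above are just throughput arithmetic; a tiny sketch for sanity-checking claims like "140-150 LPM" (helper names are mine):

```python
def lpm(submitted_links, elapsed_minutes):
    """Links per minute, the throughput figure GSA SER displays."""
    return submitted_links / elapsed_minutes

def projected_links(lpm_value, hours):
    """Total links a run would produce at a steady LPM."""
    return int(lpm_value * 60 * hours)

# At the quoted 140 LPM, an 8-hour overnight run would yield roughly
# 140 * 60 * 8 = 67,200 submissions -- so the original poster's 3
# submissions overnight really does point to a configuration or
# proxy problem rather than a shortage of sites.
```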
