Guide to finding high PR blogs using ScrapeBox?

26 replies
  • SEO
Are there any guides to finding actual high PR pages that I can comment on using ScrapeBox?
#blogs #finding #guide #high #scrapebox
  • Profile picture of the author MikeFriedman
    Scrape ==> Check PR of URL ==> Eliminate low PR pages ==> Run through Blog Analyzer to see which ones are still open for comments ==> Profit
  • Profile picture of the author JSProjects
    Originally Posted by allsystems View Post

    Any guides to finding high pr actual pages that I can comment on using scrapebox?
    Short version:

    1 - Harvest your desired keywords.

    2 - Remove duplicate DOMAINS from your results.

    3 - Run a PR check on the domains. Save any that have PR. I personally skip PR1 domains to help speed up the process.

    4 - Trim your results to root so you're left with just the domains. Copy them into the harvester window and add "site:" into the custom field box. This will harvest all of the inner pages.

    IMPORTANT: If you want to find only active blogs, set it so Google only displays results from within the last 30 days. This filters out blogs that haven't been updated within the past month. (For the most part.)

    5 - Now you've got a list of PR2+ blogs that have been updated within the past 30 days. Remove duplicate domains, again, and trim them to root.

    6 - Re-run the harvester using the "site:" footprint and make sure you're searching for ALL results, not just within the past 30 days.

    7 - Remove duplicate URLs from your harvested list. You should now be left with posts from active blogs.

    8 - Run the results through the blog analyzer to pinpoint posts that are open for comments.

    9 - Run a PR check on the posts that are open. You should find plenty of PR1+ posts if you've got a lot of blogs.

    Edit: OK, Mike's was definitely the shorter version.
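
    For anyone curious what steps 2-4 look like under the hood, here's a rough Python sketch. ScrapeBox does all of this with built-in buttons, so this is purely illustrative; the helper names are made up, and the example URLs are placeholders:

```python
from urllib.parse import urlparse

def dedupe_domains(urls):
    """Keep one URL per domain, preserving first-seen order (step 2)."""
    seen, out = set(), []
    for url in urls:
        host = urlparse(url).netloc.lower()
        if host and host not in seen:
            seen.add(host)
            out.append(url)
    return out

def trim_to_root(urls):
    """Reduce each URL to its scheme://host/ root (step 4, first half)."""
    return [f"{urlparse(u).scheme}://{urlparse(u).netloc}/" for u in urls]

def site_footprints(roots):
    """Build the 'site:' queries used to harvest all inner pages."""
    return [f"site:{urlparse(r).netloc}" for r in roots]

urls = [
    "http://example.com/post/1",
    "http://example.com/post/2",
    "http://blog.example.org/hello-world",
]
unique = dedupe_domains(urls)     # one URL per domain
roots = trim_to_root(unique)      # just the domain roots
queries = site_footprints(roots)  # ready to paste into the harvester
```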
    • Profile picture of the author jamessmbseo
      Originally Posted by JSProjects View Post

      Short version: [...]

      Thanks, this will help a lot for a newbie like me... no matter how long it is...
      • Profile picture of the author JSProjects
        Originally Posted by jamessmbseo View Post

        Thanks, this will help a lot for a newbie like me... no matter how long it is...
        No problem. It's not all that complicated, but it can definitely be a little time consuming, especially if you're working with a LOT of blogs.

        If you're looking for a shortcut you can tick the "results from the last month" option during your initial harvester run.
    • Profile picture of the author jeremynet
      Originally Posted by JSProjects View Post

      Short version: [...]
      Thanks for sharing, JSProjects. Have you ever tried running all of these with the new Automator addon?
      • Profile picture of the author Greg99
        Just wanted to say thanks, JSProjects, for giving me another insight into how to use ScrapeBox correctly. I know this software can damage sites unless used correctly... so here goes.
  • Profile picture of the author allsystems
    Wow, seems like it might take a while to do. Thanks for the help, fellas.
  • Profile picture of the author JSProjects
    Yeah, if you want to get REALLY specific then it can be time consuming. Not necessarily difficult, just takes time. I usually take it a step further though and check the number of outbound links, dofollow / nofollow status, etc.
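
    The extra checks mentioned here (outbound link count, dofollow/nofollow status) can be sketched with Python's standard-library HTML parser. This is an illustrative stand-in for what ScrapeBox's addons report, not the tool's actual code; the sample HTML and host name are made up:

```python
from html.parser import HTMLParser

class LinkAudit(HTMLParser):
    """Tally outbound links on a page and how many carry rel="nofollow"."""
    def __init__(self, own_host):
        super().__init__()
        self.own_host = own_host
        self.outbound = 0
        self.nofollow = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href", "")
        # Count only external links (not pointing back at the blog itself)
        if href.startswith("http") and self.own_host not in href:
            self.outbound += 1
            if "nofollow" in (attrs.get("rel") or ""):
                self.nofollow += 1

sample = """
<a href="http://myblog.com/about">About</a>
<a href="http://other.com/1" rel="nofollow">commenter link</a>
<a href="http://other.com/2">dofollow link</a>
"""
audit = LinkAudit("myblog.com")
audit.feed(sample)
# audit.outbound counts external links; audit.nofollow counts the nofollow'd ones
```

    A page with hundreds of outbound links, or all-nofollow comments, is usually worth skipping.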
  • Profile picture of the author allsystems
    I think I might just download some auto-approve lists; I just need them for tier 2 and tier 3. I need some high PR blogs for my money site, though. I wanted PR4-PR7 for direct links to the money site.
  • Profile picture of the author patco
    If you want it fast, you can get thousands of lists online. Otherwise, you should check do-follow backlinks in your niche: harvest for your main keywords, remove duplicates, trim to root, and check PR until you've built a list for your personal use.
  • Profile picture of the author gearmonkey
    I love ScrapeBox, but it's so time consuming trying to find proxies. And buying them is confusing. So I outsource. Lol
  • Profile picture of the author dminorfmajor
    I just bought it last night, and proxies have been the confusing part; it's actually kept me from using the software. I'm so scared of getting my IP banned by Google. Where do you guys get your proxies? Or more importantly, how do you find your own?
    • Profile picture of the author gearmonkey
      Originally Posted by dminorfmajor View Post

      Where do you guys get your proxies? Or more importantly, how do you find your own?
      Yeah, that's why I don't use the software much. My IP has actually been banned by Google once (for 24 hours), and now I don't mess with SB much. It's easier to outsource.
    • Profile picture of the author Andylinks
      Originally Posted by dminorfmajor View Post

      Where do you guys get your proxies? Or more importantly, how do you find your own?
      The best bet is to buy your proxies.
    • Profile picture of the author JSProjects
      Originally Posted by dminorfmajor View Post

      Where do you guys get your proxies? Or more importantly, how do you find your own?
      Public proxies for scraping, private (or even shared) for posting. You don't need a lot to post. 5 is fine if you're not posting to hundreds of thousands of blogs.

      Buy proxies for scrapebox, xrumer, twitter| where to buy proxies

      I've been using them for a little over 6 months and they're great.

      As far as public proxies, don't waste your time with the built-in Scrapebox sources. Find yourself a few sources that you can import into SB and check. Even better, find sources that aren't compatible directly. (Ones that you'd need to manually copy and paste into the proxy checker.) This means that most automated tools can't find them. So they're more likely to stay alive. I've got a couple of sources that I check each morning and end up with about 1,500-2,000 public proxies that will last me throughout the day.

      Also, I suggest using the "new" proxy harvester. (It's an option in the SB menu.) The old version will mark a proxy as bad if it fails the Google test, even if it works with Bing, Yahoo, or for other tasks.
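
      The copy-and-paste proxy workflow described above boils down to two steps: pull ip:port pairs out of whatever page you pasted, then test each one with a real request. A minimal Python sketch (the source text and test URL are placeholders; ScrapeBox's own checker is what you'd actually use):

```python
import re
import urllib.request

# Matches bare ip:port lines; everything else on a pasted page is skipped
PROXY_RE = re.compile(r"^(\d{1,3}(?:\.\d{1,3}){3}):(\d{1,5})$")

def parse_proxy_list(text):
    """Pull ip:port pairs out of a pasted proxy-source page, skipping junk lines."""
    return [m.group(0)
            for line in text.splitlines()
            if (m := PROXY_RE.match(line.strip()))]

def proxy_works(proxy, test_url="http://www.bing.com", timeout=10):
    """Return True if a plain HTTP request through the proxy succeeds."""
    opener = urllib.request.build_opener(
        urllib.request.ProxyHandler({"http": f"http://{proxy}"})
    )
    try:
        return opener.open(test_url, timeout=timeout).status == 200
    except Exception:
        return False

raw = """
1.2.3.4:8080
not a proxy
5.6.7.8:3128
"""
candidates = parse_proxy_list(raw)
# alive = [p for p in candidates if proxy_works(p)]  # slow: one request per proxy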
    • Profile picture of the author TheProgrammer
      Originally Posted by dminorfmajor View Post

      Where do you guys get your proxies? Or more importantly, how do you find your own?
      Private proxies are the most important part of SB; without them you can't get good results in harvesting and posting. This site charges the lowest fee for 50 proxies.
      Scrapebox Private and Public Proxies | Scrapebox Autoapprove Lists, SEO Services, Proxies, Tools, Tutorials | Email Marketing Leads
  • Profile picture of the author faisalmaximus
    Hope it will be effective. I'm going to give it a try!
  • Profile picture of the author looking4adsense
    If you want your site to be delisted and banned by Google, then go ahead and use ScrapeBox.
    • Profile picture of the author JSProjects
      Originally Posted by looking4adsense View Post

      if you want your site to be delisted and banned by google, then go ahead and use scrapebox
      Pretty broad statement.

      I can accomplish and automate about a hundred different tasks with ScrapeBox, none of which involve using the fast poster.
  • Profile picture of the author oxnard111
    JSProjects,

    I'm new to Scrapebox, how do you do the step you outline as IMPORTANT in your guide above? The step that only finds active blogs within the last 30 days in Google?

    Thanks!
    • Profile picture of the author JSProjects
      Originally Posted by oxnard111 View Post

      JSProjects,

      I'm new to Scrapebox, how do you do the step you outline as IMPORTANT in your guide above? The step that only finds active blogs within the last 30 days in Google?

      Thanks!
      Here's a screenshot. This is generally how I ensure that blogs are active. If they've got posts within 30 days then they're most likely not abandoned.
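
      The "past month" filter in that screenshot corresponds to Google's `tbs=qdr:m` URL parameter, which is what date-restricted harvester queries rely on. A small illustrative sketch (the helper name is made up; ScrapeBox builds these URLs for you):

```python
from urllib.parse import urlencode

def google_query_url(query, within_month=True):
    """Build a Google search URL; tbs=qdr:m mirrors the 'Past month' option."""
    params = {"q": query}
    if within_month:
        params["tbs"] = "qdr:m"  # qdr:w = past week, qdr:d = past day
    return "https://www.google.com/search?" + urlencode(params)

url = google_query_url("site:example.com")
# Restricts the site: footprint to pages Google indexed in the past month
```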

  • Profile picture of the author oxnard111
    Okay... can anybody help me out? I still can't figure out how to narrow my results to the past 30 days.

    I can't create a new post due to the 15 post limit, and I can't PM anybody due to the 50 post limit.

    Please help!
  • Profile picture of the author allsystems
    Wanted to open this again to say a special thanks to JSProjects. I'm following this method now and am finding some very good blogs.
    • Profile picture of the author JSProjects
      Originally Posted by allsystems View Post

      wanted to open this again to say a special thanks to JS projects. I am following this method now and am finding some very good blogs
      Thanks! Great to hear. SB definitely goes far beyond being a simple automated blog commenter.

      The method I outlined above is part of a larger method I use to put together the blog packs I offer.
