PBNs & expired domains are a hot topic these days in the SEO world

33 replies
  • SEO
Now that expired domains, and building PBNs from them, are such a hot topic: do you believe that one of these days Google will hit sites using this technique?

From a coder's point of view, it is easy to trace whether an expired domain was repurposed for the sake of sending links to a specific site.

Maybe one of these days they will reset all the juice of any domain that has already expired and whose content and niche differ from its previous content.

I believe that restoring the old content of an expired domain is safer than using the expired domain in a totally different niche.

What do you think?
#pbn #seo #expired #domains
  • Profile picture of the author Mike Anthony
    Originally Posted by edpudol1973 View Post

    Maybe one of these days they will reset all the juice of any domain that has already expired and whose content and niche differ from its previous content.

    What do you think?
    I think that will eventually happen, which is why I tell everyone to get going with their PBN buying now. It will not be an option to buy expired domains forever. Once Google pulls that switch, every domain expiring after that will have no value for SEO. It's the one thing that WILL work, unlike all the theories people have been floating around.

    In addition, they can monitor domains that do not expire but go through auctions, and do the same thing.

    PBN auction domain buying......toast.
  • Profile picture of the author Mike Anthony
    Originally Posted by edpudol1973 View Post

    I believe that restoring the old content of an expired domain is safer than using the expired domain in a totally different niche.

    What do you think?
    You might mean something different, so this is not aimed at you, but I wish people would stop using the word "restoring" and call it what it is:

    STEALING other people's content.

    Whatever blowback you get from the previous owner will be well deserved.
    • Profile picture of the author edpudol1973
      Originally Posted by Mike Anthony View Post

      You might mean something different, so this is not aimed at you, but I wish people would stop using the word "restoring" and call it what it is:

      STEALING other people's content.

      Whatever blowback you get from the previous owner will be well deserved.
      I agree with you that the old content is owned by the previous owner and should not be used for personal or commercial purposes.

      But on second thought, archive.org uses that content anyway without the permission of the site owner. And if the previous owner has already abandoned the site and its content, do you think it is still unethical?
      • Profile picture of the author Mike Anthony
        Originally Posted by edpudol1973 View Post


        But on second thought, archive.org uses that content anyway without the permission of the site owner. And if the previous owner has already abandoned the site and its content, do you think it is still unethical?
        Archive.org is not stealing content from the site; it is taking a snapshot, like the Google cache does. Marketers are not taking snapshots of the sites. They are stealing the content and then associating it with their own money sites.

        There's no way to spin it: it's stealing content and using it for yourself. Done enough, it's bound to tick off a former owner (who in some cases doesn't realize the domain has expired until months later). If that's ever done to my former sites, I'd submit to Google and get Mr. DMCA rolling.
  • Profile picture of the author chris_87
    Originally Posted by edpudol1973 View Post

    Maybe one of this days they will reset all the juice of any domains that already expired, which have different content and niche from it's previous content.
    It seems like they are taking more conservative steps: for example, according to John Mueller, no longer updating PageRank. Relevance seems like the route they are taking instead; if they start looking at the relevance of links, that would put an end to many networks. Although, as it currently stands with relevance, they are doing a pretty poor job.

    That, and footprints that make it easy to identify junk sites (blocking bots, homepage-only links, posts, etc.).
    • Profile picture of the author Mike Anthony
      Originally Posted by chris_87 View Post

      It seems like they are taking more conservative steps: for example, according to John Mueller, no longer updating PageRank. Relevance seems like the route they are taking instead; if they start looking at the relevance of links, that would put an end to many networks.
      That's a theory that many people here are trying to float, but don't be fooled by them. The typical organically expiring domain with juice gets a SUBSTANTIAL number of links as naked URLs. Anchor-text link relevance would not drastically affect the practice of PBNs.
      • Profile picture of the author chris_87
        Originally Posted by Mike Anthony View Post

        That's a theory that many people here are trying to float, but don't be fooled by them. The typical organically expiring domain with juice gets a SUBSTANTIAL number of links as naked URLs. Anchor-text link relevance would not drastically affect the practice of PBNs.
        I am not specifically talking about anchor text, but about the backlink pages themselves. Is the backlink page part of a niche-relevant domain, or at least is the page itself relevant? Is there anything relevant within the <h1>, the <title>, or the context of the page?

        That is what I am talking about.
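For illustration, chris_87's page-level relevance check is easy to prototype. Below is a minimal sketch in Python, using only the standard library; the sample page, the niche-term list, and the pass/fail rule are invented for the example, and none of this is a confirmed Google mechanism.

```python
# Toy version of the relevance check described above: does the linking
# page's <title>, first <h1>, or body text contain any niche-related
# terms? Terms must be lowercase; the scoring rule is invented.
from html.parser import HTMLParser

class RelevanceParser(HTMLParser):
    """Collects the <title> text, <h1> text, and all visible text."""
    def __init__(self):
        super().__init__()
        self._tag = None
        self.title = ""
        self.h1 = ""
        self.body = []

    def handle_starttag(self, tag, attrs):
        if tag in ("title", "h1"):
            self._tag = tag

    def handle_endtag(self, tag):
        if tag == self._tag:
            self._tag = None

    def handle_data(self, data):
        if self._tag == "title":
            self.title += data
        elif self._tag == "h1":
            self.h1 += data
        self.body.append(data)

def is_relevant(html, niche_terms):
    """True if any niche term appears in the title, <h1>, or body text."""
    p = RelevanceParser()
    p.feed(html)
    haystack = " ".join([p.title, p.h1, " ".join(p.body)]).lower()
    return any(term in haystack for term in niche_terms)

page = ("<html><head><title>Dog Grooming Tips</title></head>"
        "<body><h1>Grooming 101</h1><p>Brush your dog daily.</p></body></html>")
print(is_relevant(page, {"forex", "trading"}))  # False: off-niche backlink page
print(is_relevant(page, {"dog", "grooming"}))   # True: niche-relevant page
```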
        • Profile picture of the author Mike Anthony
          Originally Posted by chris_87 View Post

          I am not specifically talking about anchor text, but about the backlink pages themselves. Is the backlink page part of a niche-relevant domain, or at least is the page itself relevant? Is there anything relevant within the <h1>, the <title>, or the context of the page?

          That is what I am talking about.
          Yeah, I know the argument, but it can't work to shut off all juice. Non-relevant sites link to other niches ALL THE TIME. People have argued this a lot recently, but it's impossible to make work because of how deep and wide people refer to things and link to sites. One example should make it clear: newspapers link all the time to pages that don't have relevance to anything in their H1 tags or on the page. Humans are just not that limited.

          A link from CNN is no good because it's not relevant to the H1 tags? I'll believe that when I see it.
          • Profile picture of the author chris_87
            Originally Posted by Mike Anthony View Post

            One example should make it clear: newspapers link all the time to pages that don't have relevance to anything in their H1 tags or on the page. Humans are just not that limited.
            Newspapers and business directories, both wide (Best of the Web, DMOZ) and narrow (at the local, state, and city level), have some degree of relevance if they are linking to a business site. I am not saying the backlink page needs to have a niche-related phrase or keyword in the <title> or <h1>, or anything that specific. But if you buy a domain about dog grooming and repurpose it to be about forex trading, then it just becomes obvious what you are trying to do.

            Originally Posted by Mike Anthony View Post

            A link from CNN not good because its not relevant to the H1 tags? I'll believe that when I see it.
            Of course it has value; I never said anything that extreme.
            • Profile picture of the author Mike Anthony
              Originally Posted by chris_87 View Post

              Of course it has value; I never said anything that extreme.
              Well, there's been enough discussion and argument on that recently, so I am totally bored with it.

              Call me if it ever happens.

              As long as there is sufficient juice, which you concede anyway, PBNs will continue to work.
    • Profile picture of the author MikeFriedman
      Originally Posted by chris_87 View Post

      It seems like they are taking more conservative steps: for example, according to John Mueller, no longer updating PageRank. Relevance seems like the route they are taking instead; if they start looking at the relevance of links, that would put an end to many networks. Although, as it currently stands with relevance, they are doing a pretty poor job.
      I don't think they could really take that approach effectively. Once you go out beyond the first tier of links for most websites, things quickly become irrelevant and turn into what would be considered spam.

      Originally Posted by chris_87 View Post

      That, and footprints that make it easy to identify junk sites (blocking bots, homepage-only links, posts, etc.).
      This is the easiest way for them to combat private networks.
  • Profile picture of the author Mike Anthony
    The best way would be to reset expired and auctioned domains, and I think they might even have tried that in the past. The problem they are probably having is stopping Googlebot from picking the links back up as it crawls the web. Any change to a page would mark it as a potentially updated page, and the link along with it.
    • Profile picture of the author deezn
      Originally Posted by Mike Anthony View Post

      The best way would be to reset expired and auctioned domains, and I think they might even have tried that in the past. The problem they are probably having is stopping Googlebot from picking the links back up as it crawls the web. Any change to a page would mark it as a potentially updated page, and the link along with it.
      That's the thing. They would have to store all the disqualified links somewhere, no?
    • Profile picture of the author jinx1221
      Originally Posted by Mike Anthony View Post

      The best way would be to reset expired and auctioned domains, and I think they might even have tried that in the past. The problem they are probably having is stopping Googlebot from picking the links back up as it crawls the web. Any change to a page would mark it as a potentially updated page, and the link along with it.
      One other reason I can think of for why they don't reset them (yet, anyway) is what you pointed out a few posts back: a previous owner might not realize that his site expired, then renew it, and now the site would be back to square one again. That might make some people a little mad if they had worked their site up to page one. Or maybe they will have a set timeframe for resetting them (a year?). Or, who knows, maybe there just aren't enough people who forget to renew, so Google could ignore that possibility and reset them anyway (for the greater purpose of doing away with PBNs).
    • Profile picture of the author edpudol1973
      Originally Posted by Mike Anthony View Post

      The best way would be to reset expired and auctioned domains, and I think they might even have tried that in the past. The problem they are probably having is stopping Googlebot from picking the links back up as it crawls the web. Any change to a page would mark it as a potentially updated page, and the link along with it.
      If they reset all expired and auctioned domains, all they would need to do is adjust their algorithm to treat those domains as new. Any link older than the latest WHOIS update would not be crawled by their bot and would not be counted as a backlink.
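To make that concrete, here is a minimal sketch (Python) of the cutoff rule edpudol1973 describes: any backlink first seen before the domain's latest WHOIS update is treated as belonging to the previous owner and dropped. The WHOIS dates and link records are hypothetical; nothing here is a confirmed Google mechanism.

```python
# Hypothetical "reset on WHOIS update" rule: discard backlinks first
# crawled before the target domain's latest WHOIS change.
from datetime import date

whois_updated = {"expired-example.com": date(2014, 8, 1)}  # invented WHOIS data

backlinks = [  # (target domain, date the crawler first saw the link)
    ("expired-example.com", date(2012, 3, 14)),  # pre-dates re-registration
    ("expired-example.com", date(2014, 11, 2)),  # earned by the new owner
]

def surviving_links(links, whois):
    """Keep only links first seen on or after the latest WHOIS update."""
    return [(dom, seen) for dom, seen in links if seen >= whois[dom]]

print(surviving_links(backlinks, whois_updated))
# -> [('expired-example.com', datetime.date(2014, 11, 2))]
```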
      • Profile picture of the author Kevin Maguire
        Originally Posted by edpudol1973 View Post

        Actually, it is not an assumption; a coder can create a system (thinking with a programmer's mind) that checks the following to identify whether a site was repurposed or not:

        1. Compare the current content with the previous year(s)' content; if different, add 1 point.
        2. Compare the current niche with the previous niche; if not the same, add 1 point.
        3. Check whether the domain was parked at the registrar before the new content was created; if yes, add 1 point.
        4. Check whether the domain was listed on domain auction sites before; if yes, add 1 point.
        5. Check whether the domain was dropped before; if yes, add 1 point.
        6. Does the new site's content link to external sites that have no authority? If yes, add 1 point.

        What the system does would depend on how many points the site gets. The safest scenario would be to reset all the juice and disregard all of the site's old backlinks and PR.
        The worst scenario would be to penalize the target site.

        That concept is just from me, a coder with no voice. Google has the best coders and engineers; they can figure out a better solution than I can.
        Originally Posted by edpudol1973 View Post

        If they reset all expired and auctioned domains, all they would need to do is adjust their algorithm to treat those domains as new. Any link older than the latest WHOIS update would not be crawled by their bot and would not be counted as a backlink.
        Inbound links don't come with a "Date of Creation" tag.
        Dictionary words are finite; it's inevitable that at some point all popular worded domains will have had more than one owner, which could easily mean completely different subject matter.
        Big corporations would choke Google, as they buy legitimate websites all day long, paying huge amounts of money for them.

        And users create relevance, not Google.
        • Profile picture of the author nik0
          Banned
          Call it restoring content or call it stealing; it doesn't matter, the law is very clear about this.

          If you don't damage the company by doing so, they have to warn you first, and when you follow up by removing the content, it's a done case.
          • Profile picture of the author Kevin Maguire
            Originally Posted by nik0 View Post

            Call it restoring content or call it stealing; it doesn't matter, the law is very clear about this.

            If you don't damage the company by doing so, they have to warn you first, and when you follow up by removing the content, it's a done case.
            I pray you never get hit, then; your idea of "damages" might differ somewhat from a court's.
            • Profile picture of the author nik0
              Banned
              Originally Posted by Kevin Maguire View Post

              I pray you never get hit, then; your idea of "damages" might differ somewhat from a court's.
              Personally, I don't worry that much about it.

              It does become a bit risky when using VAs to perform such tasks, though.
          • Profile picture of the author Mike Anthony
            Originally Posted by nik0 View Post

            Call it restoring content or call it stealing; it doesn't matter, the law is very clear about this.
            Yep, it is, which is why people who steal content and get caught are never found by the courts to be in the right.

            If you don't damage the company by doing so, they have to warn you first, and when you follow up by removing the content, it's a done case.
            Bull. Spoken like a content crook. In your case, you actually SELL other people's content that you stole from their Wayback Machine entries. If you think you or your customers can't be sued for that, you are delusional.
        • Profile picture of the author edpudol1973
          Originally Posted by Kevin Maguire View Post

          Inbound links don't come with a "Date of Creation" tag.
          But the date when the page the link came from was crawled is recorded.
          • Profile picture of the author yukon
            Banned
            Originally Posted by edpudol1973 View Post

            But the date when the page the link came from was crawled is recorded.
            I have no doubt about that.

            We already know Google saves cache dates; it's in plain view.

            I'm also sure Google has their own version of the Wayback Machine, saving all those old cached pages/URLs. If Wayback has the server capacity, you know Google does.
            • Profile picture of the author edpudol1973
              Originally Posted by yukon View Post

              I have no doubt about that.

              We already know Google saves cache dates; it's in plain view.

              I'm also sure Google has their own version of the Wayback Machine, saving all those old cached pages/URLs. If Wayback has the server capacity, you know Google does.
              Totally agree; for sure they have their own database.

              This is the reason why many long-time users of this strategy guarded it for so long: they were afraid to reveal it to the public because it might be abused. And now it's happening.
          • Profile picture of the author Kevin Maguire
            Originally Posted by edpudol1973 View Post

            But the date that page where link came from was recorded.
            That's absolutely no indicator of link creation; post dates are as optional as anything. But more to the point, take my own site as an example.

            I have a very extensive group of main resource pages, and I frequently change and update the content of those pages: new content, info, resource links, etc.

            Even Google wants me to "keep my content updated." I guess, as others have pointed out, if it were that easy, it would have been done years ago.
            • Profile picture of the author chris_87
              Originally Posted by Kevin Maguire View Post

              That's absolutely no indicator of link creation; post dates are as optional as anything. But more to the point, take my own site as an example.
              If you use the daterange: search operator with Julian-format dates, the results returned show when Google first cached the page.
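For anyone who wants to try that: daterange: expects Julian day numbers rather than calendar dates. A small standard-library helper for the conversion; the example query and domain are placeholders.

```python
# Convert a calendar date to the Julian day number that the
# daterange: operator expects.
from datetime import date

def julian_day(d):
    # date.toordinal() counts days from 0001-01-01; the constant
    # 1721425 shifts that count to the astronomical Julian day number.
    return d.toordinal() + 1721425

start = julian_day(date(2014, 1, 1))   # 2456659
end = julian_day(date(2014, 6, 30))    # 2456839
print(f"site:example.com daterange:{start}-{end}")
```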
              • Profile picture of the author Kevin Maguire
                Originally Posted by chris_87 View Post

                If you use the daterange: search operator with Julian-format dates, the results returned show when Google first cached the page.
                Yes, thanks for confirming my point: date of page creation has no correlation to date of cache.

                This thread's turning out funny, though, since Google haven't figured out a way to do this yet (obviously). You guys are sure doing a hell of a job of figuring it all out for them publicly.
            • Profile picture of the author yukon
              Banned
              Originally Posted by Kevin Maguire View Post

              That's absolutely no indicator of link creation; post dates are as optional as anything. But more to the point, take my own site as an example.

              I have a very extensive group of main resource pages, and I frequently change and update the content of those pages: new content, info, resource links, etc.
              Let me tell you something about post dates: I've been faking dates in the SERPs for years (thank you, PHP). The reason I do that is that my content is evergreen, some of it from 2005 on my old sites. With recent dates, it looks to visitors like my old evergreen content is new.

              When I do a site:domain search and switch the Google SERP date filter to show pages from the last month, Google says I have dozens of new pages (lol), when I haven't updated those old sites with new content pages in probably a year.

              Obviously the post date isn't in any way related to a cache date. I know Google is smarter than to think my old pages are new; I'm just saying their SERP date is whatever you want it to be, without manually updating any webpage.
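yukon credits PHP for this; purely as an illustration, here is the same trick sketched in Python (the markup and the three-day offset are invented for the example):

```python
# Render the visible "posted" date at request time instead of storing
# it, so evergreen content always looks recently published.
from datetime import date, timedelta

def fake_post_date(days_ago=3):
    return (date.today() - timedelta(days=days_ago)).strftime("%B %d, %Y")

print(f"<p class='post-date'>Posted on {fake_post_date()}</p>")
# Always shows a date three days ago, regardless of the page's real age.
```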



              Originally Posted by Kevin Maguire View Post

              Even Google wants me to "keep my content updated."
              Just lol


              *************************************************

              I'm still going with: Google has their own Wayback (a.k.a. the cache). At the very least, a date first found and a date last found for webpages.
              • Profile picture of the author Mike Anthony
                You can go with whatever you want; it's still nothing more than daydreaming, and caches are small (not comprehensive storage of all previous versions of all sites):

                http://en.wikipedia.org/wiki/Cache_(computing)

                Date first found and date last found would not tell Google when the link was or was not placed anyway, since writers revise content all the time.
      • Profile picture of the author Mike Anthony
        Originally Posted by edpudol1973 View Post

        If they reset all expired and auctioned domains, all they would need to do is adjust their algorithm to treat those domains as new. Any link older than the latest WHOIS update would not be crawled by their bot and would not be counted as a backlink.
        Pages change, and the entire page and its source get updated even if you just add a letter or an image in the sidebar.

        We already know Google saves cache dates; it's in plain view.

        I'm also sure Google has their own version of the Wayback Machine, saving all those old cached pages
        I am sure they don't, because corporations do not waste huge resources for little purpose. The Wayback Machine is no comparison: it does not cover tons of sites, and even for the sites it does cover, it crawled only a fraction of the days the site was up.

        For a company to save every page on the internet going back 15 years, and then run a program to sift through all of that, is just a ridiculous load on whatever they use. I bet even the NSA doesn't store that kind of data, and their systems cost billions of dollars per year.

        Proof is in the pudding. If Google had all these resources so easy to use, then PBNs would have been toast long ago. That tells the smart person all they need to know.

        We are getting back into the Google God fallacy, but you guys can run with it... it's entirely speculation stated as fact.
  • Profile picture of the author hipeopo02
    Originally Posted by edpudol1973 View Post

    Maybe one of these days they will reset all the juice of any domain that has already expired and whose content and niche differ from its previous content.
    This is where I'm a little confused.

    I thought G already did this. When a domain gets re-registered, doesn't it need to be recrawled anyway? As long as the links are there, what's the problem?

    Maybe the links that stick are coming from webmasters who like the new site on the domain.

    G can't assume a repurposed domain was repurposed to game their search engine until they get definitive proof.
    • Originally Posted by hipeopo02 View Post

      This is where I'm a little confused.

      I thought G already did this. When a domain gets re-registered, doesn't it need to be recrawled anyway? As long as the links are there, what's the problem?

      Maybe the links that stick are coming from webmasters who like the new site on the domain.

      G can't assume a repurposed domain was repurposed to game their search engine until they get definitive proof.
      And how exactly would they know that a repurposed domain is being used to game their search engine? They wouldn't.
    • Profile picture of the author edpudol1973
      Originally Posted by hipeopo02 View Post

      G can't assume a repurposed domain was repurposed to game their search engine until they get definitive proof.
      Actually, it is not an assumption; a coder can create a system (thinking with a programmer's mind) that checks the following to identify whether a site was repurposed or not:

      1. Compare the current content with the previous year(s)' content; if different, add 1 point.
      2. Compare the current niche with the previous niche; if not the same, add 1 point.
      3. Check whether the domain was parked at the registrar before the new content was created; if yes, add 1 point.
      4. Check whether the domain was listed on domain auction sites before; if yes, add 1 point.
      5. Check whether the domain was dropped before; if yes, add 1 point.
      6. Does the new site's content link to external sites that have no authority? If yes, add 1 point.

      What the system does would depend on how many points the site gets. The safest scenario would be to reset all the juice and disregard all of the site's old backlinks and PR.
      The worst scenario would be to penalize the target site.

      That concept is just from me, a coder with no voice. Google has the best coders and engineers; they can figure out a better solution than I can.
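A minimal sketch of what that point system could look like in code. Every check is stubbed with a boolean, since the real signals (content diffs, niche classification, parking/auction/drop history, outbound-link authority) are outside the scope of the example, and the two thresholds are invented:

```python
# Stubbed version of the six-point repurposing score described above.
def repurpose_score(domain):
    checks = [
        domain["content_changed"],       # 1. content differs from previous years
        domain["niche_changed"],         # 2. niche differs from previous niche
        domain["was_parked"],            # 3. parked at registrar before relaunch
        domain["was_auctioned"],         # 4. listed on domain auction sites
        domain["was_dropped"],           # 5. dropped before
        domain["links_to_no_authority"], # 6. links out to no-authority sites
    ]
    return sum(checks)  # booleans count as 0 or 1

site = {
    "content_changed": True, "niche_changed": True, "was_parked": True,
    "was_auctioned": True, "was_dropped": False, "links_to_no_authority": True,
}

score = repurpose_score(site)  # -> 5
if score >= 5:                 # thresholds invented for the example
    print("worst case: penalize the target site")
elif score >= 3:
    print("safer case: reset juice, disregard old backlinks and PR")
else:
    print("leave the domain as-is")
```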
      • Profile picture of the author hipeopo02
        Originally Posted by edpudol1973 View Post

        Actually, it is not an assumption; a coder can create a system (thinking with a programmer's mind) that checks the following to identify whether a site was repurposed or not:

        1. Compare the current content with the previous year(s)' content; if different, add 1 point.
        2. Compare the current niche with the previous niche; if not the same, add 1 point.
        3. Check whether the domain was parked at the registrar before the new content was created; if yes, add 1 point.
        4. Check whether the domain was listed on domain auction sites before; if yes, add 1 point.
        5. Check whether the domain was dropped before; if yes, add 1 point.
        6. Does the new site's content link to external sites that have no authority? If yes, add 1 point.
        I guess programming is a way to put certain domains on a "watch list," but that's it...

        The problem is that G is not a "judge"; they react to things that they KNOW are there, not things that they THINK are there.

        Look at the damage they do to legit webmasters when they roll out a new algo.

        Imagine if they started acting on a hunch that someone was up to no good... hahah.

        Your "1 point" programming script can't give a definitive answer about a human's intent.
