Internal Research from Facebook Shows that Re-Shares Can Significantly Amplify Misinformation

by WarriorForum.com Administrator
13 replies
A new article on Social Media Today asks: what if Facebook removed post shares entirely, as a means to limit the spread of misinformation in its apps? What impact would that have on Facebook engagement and interaction?

That question follows the release of new insights from Facebook's internal research, surfaced as part of the broader 'Facebook Files' leak, which found that post shares play a key role in amplifying misinformation and spreading harm among the Facebook community. As reported by Alex Kantrowitz in his newsletter Big Technology:

"The report noted that people are four times more likely to see misinformation when they encounter a post via a share of a share - kind of like a retweet of a retweet - compared to a typical photo or link on Facebook. Add a few more shares to the chain, and people are five to ten times more likely to see misinformation. It gets worse in certain countries. In India, people who encounter "deep reshares," as the researchers call them, are twenty times more likely to see misinformation."
So it's not direct shares, as such, but re-amplified shares that are more likely to be the kind of controversial, divisive, shocking or surprising reports that gain viral traction in the app:

"The study found that 38% of all [views] of link posts with misinformation take place after two reshares. For photos, the numbers increase - 65% of views of photo misinformation take place after two reshares. Facebook Pages, meanwhile, don't rely on deep reshares for distribution. About 20% of page content is viewed at a reshare depth of two or higher.
So what if Facebook eliminated shares entirely, forcing people either to create their own posts to share content or to comment on the original post? That would slow the rapid amplification that currently takes nothing more than the tap of a button.

Well, Facebook has already made changes on this front, potentially linked to this research. Last year, Facebook-owned (now Meta-owned) WhatsApp implemented new limits on message forwarding to stop the spread of misinformation through message chains, with forwarding restricted to 5x per message. This, WhatsApp says, has been effective:

"Since putting into place the new limit, globally, there has been a 70% reduction in the number of highly forwarded messages sent on WhatsApp. This change is helping keep WhatsApp a place for personal and private conversations."
  • Jenni30
    Informative post. Thank you for posting.
  • dave_hermansen
    Makes no sense at all. In order to re-share misinformation, somebody had to share it in the first place. So, sorry, it IS direct posts and the first share that start the spread of misinformation.

    It's the snowball effect that skews the numbers and allows them to reach the conclusion they wanted to reach in the first place. If I post something and 50 people see it, a certain percentage of those people will share it (we'll say 10%). So now 5 people share it to their 50 people each, the same 10% re-share it, and now there are 25 people sharing it to their 50, and so on, and so on.
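
    To put rough numbers on that snowball, here is a quick illustrative model - the 50-viewer reach and 10% reshare rate are made-up figures for the example, not study data:

    Code:
    # Illustrative model of the reshare "snowball" - the 50-viewer reach
    # and 10% reshare rate are assumptions for the example, not study data.
    REACH_PER_SHARE = 50   # people who see each share (assumed)
    RESHARE_RATE = 0.10    # fraction of viewers who reshare (assumed)

    shares = 1             # the original post
    total_views = 0
    for depth in range(5):
        views = shares * REACH_PER_SHARE
        total_views += views
        print(f"depth {depth}: {shares} shares, {views} views, "
              f"{total_views} total views")
        shares = int(views * RESHARE_RATE)
    # depth 0: 1 shares, 50 views, 50 total views
    # depth 1: 5 shares, 250 views, 300 total views
    # depth 2: 25 shares, 1250 views, 1550 total views

    Purely mechanical growth like this puts most views at a reshare depth of two or more, which is exactly the pattern the study reports.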

    It's sad that someone had to do a "study" to determine something that common sense would provide.

    Now, the real question is ... did they do the same study to see the amount of re-shares of legitimate, reliable information? The numbers would have likely been the same but a study like that doesn't make headlines!
    Signature
    BizSellers.com - The #1 place to buy & sell websites!
    We help sellers get the MAXIMUM amount for their websites and all buyers know that these sites are 100% vetted.
    • savidge4
      Originally Posted by dave_hermansen

      Now, the real question is ... did they do the same study to see the amount of re-shares of legitimate, reliable information? The numbers would have likely been the same but a study like that doesn't make headlines!
      You're right... the study is stupid... BUT it's also a bit misleading. Since the last "update" to the algorithm, where negative headlines are propagated more, there is now more "misinformation" spreading than ever before, so they can make the case they are trying to make... i.e., change Section 230 to include rules to regulate the internet - since right now they are basically breaking the rules by regulating the internet themselves.
      Signature
      Success is an ACT not an idea
      • Odahh
        Originally Posted by savidge4

        You're right... the study is stupid... BUT it's also a bit misleading. Since the last "update" to the algorithm, where negative headlines are propagated more, there is now more "misinformation" spreading than ever before, so they can make the case they are trying to make... i.e., change Section 230 to include rules to regulate the internet - since right now they are basically breaking the rules by regulating the internet themselves.
        With their algorithms and invasive information-gathering practices, they are now going to use that information to decide what people can think.

        This is exactly why people have feared information gathering by internet giants and non-governmental agencies.
        • savidge4
          Originally Posted by Odahh

          With their algorithms and invasive information-gathering practices, they are now going to use that information to decide what people can think.

          This is exactly why people have feared information gathering by internet giants and non-governmental agencies.
          What they "Think" is the long term game yes... what they buy is the longer play - One they have played for years and years now. The thinking was once understanding what one buys to what ones thoughts may be. Then they understoood what one looks at will determine, what one looks at, above and beyond buys are a clear indicator of thought.

          Now they have just begun in using profiling to distribute news as an example. They willoffer up some of what you are interested in - benign sports stuff, or hobby interests... but when it relates to world news they will provide the opposite - or atleast this is what has been going on as of recently.

          I have "Tested" this theory. The "Democrat" me gets Right minded news, and the "Republican" me gets Left minded news. And there is NO middle ground in this. Interestingly the left me has a much more open internet than say the right me. The left me can post on Facebook without restrictions. The Right me on the other hand, I cant even post an item on facebook with out it being "Reviewed".

          On Google the left me can search and get valid results for a search that may or may not include acurate data. The right me doing the same search will get junked up narrative returns that I would have to weed through for hours to get any amount of accurate data.

          I know I am not alone in this... I know the whole plan will fall apart at some point and things like Section 230 will be enforced - I am sure of this because i dont think there is a way n who knows who that anything allowing restriction of speech will ever pass during this administration. Facebook is trying with Ads Google is a bit more subtle about it, but they are pushing for the same thing.

          We HAVE rules and regulations that make the current big tech actions basically Illegal - but they are obviously ignored. At some point the silent majority will really understand this and the push for open communication.
          Signature
          Success is an ACT not an idea
      • fusionhostUK
        Easy money for a study, though.
  • terrysmith3
    I think we need to fight disinformation when the first post is published. It is because of the first message that more misinformation is generated.
    • dave_hermansen
      Originally Posted by terrysmith3

      I think we need to fight disinformation when the first post is published. It is because of the first message that more misinformation is generated.
      Well, this study was about "misinformation", not "disinformation". They are two entirely different things.
      Signature
      BizSellers.com - The #1 place to buy & sell websites!
      We help sellers get the MAXIMUM amount for their websites and all buyers know that these sites are 100% vetted.
  • I think this could work with certain types of misinformation, like the kind used in chain posts, which is really common. But that's just a really small part of a big ecosystem of misinformation.

    I don't think getting rid of shares or limiting them would make a serious dent in misinformation. It might help change some posting habits, but I don't think it's going to make an impact.
  • Odahh
    savidge4

    I agree with you. I just think it will be the silent majority who fixes the problem, once the world goes back to where big corporations actually have to be profitable.
  • plopza
    Every know-nothing I know shares misinformation constantly. TBH, most of the shares I see are misinformation. I would say the people I see are more likely to share some dumb misinformation than something that is worth sharing. For all I care, they could abandon the feature completely. The only exception, I think, is people who share to moderated groups; that is typically stuff I don't mind seeing. Timeline shares, however, at least in my newsfeed, are worthless.
  • A7m3d
    Misinformation will always be part of the internet. It exists in the physical offline world, and the virtual world is no different.

    The problem here is the scale and extent.

    Plus, enabling big tech to control the information flow will result in bigger issues.

    I don't see a real need to change the current situation. It is more of a political war than an honest effort to enhance the user experience.

    Indeed, I see now as the right time for developing a decentralized internet and new social platforms that are not controlled by just 4-5 giant companies.

    Platforms that are based on blockchain, which can interoperate with one another and exchange and distribute information.


    -A7m3d
    • WF- Enzo (Administrator)
      How does blockchain relate to misinformation?

      Originally Posted by A7m3d


      Platforms that are based on blockchain, which can interoperate with one another and exchange and distribute information.


      -A7m3d