Misinformation Threatens Democracy; How Do We Counter It?

Image from michalmatlon.com; found via unsplash.com

“The election was stolen.” “There is no pandemic.” “Vaccines against Covid-19 are killing thousands of people.” “Humans have nothing to do with climate change.” And on and on.

The US has long had naysayers with their own views of reality. But through a confluence of factors, including the enormous impact of social media, we are now confronting a problem whose ramifications for our democracy are vast.

The problem seems overwhelmingly diffuse and intractable. But is it? Some researchers offer rays of hope. As we need rays of hope, I willingly immersed myself in a journal article brimming with P values and data that yielded some positive findings.

(A P value is the probability of seeing a difference at least as large as the one observed if nothing but chance were at work. Thus, a lower P value indicates greater statistical significance than would a higher one. This is not my most comfortable milieu…)
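
For the curious, here is a small illustration of how such a number gets computed. The figures are ones I made up purely for demonstration; they are not from the study, and the code is just a sketch using a standard statistical test:

    # Hypothetical example (not data from the paper): suppose 40 of 100
    # respondents rated a true headline accurate, versus 20 of 100 for a
    # false headline. Fisher's exact test asks how likely a gap at least
    # that large would be if headline truth actually made no difference.
    from scipy.stats import fisher_exact

    table = [[40, 60],   # true headline: rated accurate / rated not accurate
             [20, 80]]   # false headline: rated accurate / rated not accurate
    odds_ratio, p_value = fisher_exact(table)
    print(f"P value: {p_value:.4f}")

With made-up numbers like these, the P value comes out well below 0.05: a gap that large would be quite unlikely if headline truth made no difference at all.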

The article, in Nature, is titled “Shifting attention to accuracy can reduce misinformation online.” Its authors’ varied expertise encompasses business, media, science, information technology, and psychology.

Through seven studies, they tested a series of competing hypotheses about why people share misinformation online—and how this tendency might be changed.

The first few examined whether respondents’ actions were the result of confusion (they mistakenly believed the misinformation was accurate), a preference for partisanship over accuracy(!), or inattention to accuracy. (Emphases mine throughout.)

They provided two groups with headlines, lede sentences, and images from actual social media stories. Half of the samples were true and half were false; half were designed to appeal to self-identified Democrats and half to Republicans.

The groups, selected randomly, were asked either whether the items were true (accurate) or whether they would consider sharing them with others.

The true headlines were rated accurate significantly more often than were the false ones. “Politically concordant” items—those aligned with a respondent’s own politics—were also rated accurate more often than discordant ones, but to a far smaller degree.

In the group that was asked if they’d share the stories, there was a reversal: respondents were more likely to share a politically concordant headline—whether or not it was accurate.

The example the authors included was a headline: “Over 500 ‘Migrant Caravaners’ Arrested With Suicide Vests.” Although only 15.7% of Republicans rated it as accurate, 51.1% said they would consider sharing it.

Even so, at the end of the study, the preponderance of respondents said they rated accuracy “extremely important.”

The authors concluded that the selections were influenced less by an individual’s political preference and more by “inattention.”

The social media milieu moves attention more toward “other factors, such as the desire to attract and please followers/friends or to signal one’s group membership.”

This pattern, the authors write, parallels a subsequent study they conducted with actual Twitter users, which examined their willingness to share items that accorded with their political views regardless of veracity.

Those users were people who had linked to two right-wing sites that fact-checkers regard as “highly untrustworthy”: Breitbart and Infowars. The researchers sent them a message asking their opinion of a single non-political headline.

The result was improved quality in the news items they later shared. It almost sounds too good to be true, doesn’t it? Of course, we don’t have information about how long this change lasted.

Similarly, in the authors’ other studies, a “treatment” group that was subtly encouraged to consider the accuracy of certain news items went on to share items twice as accurate as those shared by a “control” group that received no such encouragement.

Even more encouraging, “the treatment effect was actually significantly larger for politically concordant headlines than for politically discordant headlines.” And more encouraging still, the treatment effect was significant among self-identified Democrats and Republicans alike.

“Therefore, shifting attention to the concept of accuracy can cause people to improve the quality of the news that they share.”

The authors conjecture that people’s sharing materials they don’t firmly believe in suggests that the level of partisan fervor may be less intense than one might assume. They wisely stress the need for further research to try to determine people’s “state of belief when not reflecting on accuracy.”

They have already replicated these study findings with Covid-19 headlines and found comparable results. They recommend further work involving subjects such as organized disinformation about climate change and fraud in the 2020 US election. What an excellent idea!

Here’s their conclusion:

“Our results suggest that the current design of social media platforms—in which users scroll quickly through a mixture of serious news and emotionally engaging content, and receive instantaneous quantified social feedback on their sharing—may discourage people from reflecting on accuracy.

“But this need not be the case. Our treatment translates easily into interventions that social media platforms could use to increase users’ focus on accuracy.

“For example, platforms could periodically ask users to rate the accuracy of randomly selected headlines, thus reminding them about accuracy in a subtle way that should avoid reactance… (and simultaneously generating useful crowd ratings that can help to identify misinformation…).

“Such an approach could potentially increase the quality of news circulating online without relying on a centralized institution to certify truth and censor falsehood.”
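
And to show just how simple that suggestion is, here is a rough sketch of what such a periodic prompt might look like. To be clear, this is entirely my own illustration; the names, the one-in-twenty frequency, and the code itself come from me, not from the paper or from any actual platform:

    import random

    PROMPT_PROBABILITY = 0.05  # arbitrary choice: roughly one session in twenty

    def maybe_prompt_for_accuracy(headlines, ask_user):
        """Occasionally show one randomly chosen headline and ask the user
        whether it is accurate. 'headlines' is a list of headline strings;
        'ask_user' stands in for however a platform would pose a question.
        Returns (headline, rating) if a prompt was shown, otherwise None."""
        if random.random() < PROMPT_PROBABILITY:
            headline = random.choice(headlines)
            rating = ask_user(f"To the best of your knowledge, is this headline accurate? {headline}")
            return headline, rating  # ratings could also feed the crowd data the authors mention
        return None

The mechanics, in other words, are trivial; whether the platforms are willing to deploy something like them is another matter.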

This is, of course, a single effort in a burgeoning field of study that is looking at both the external factors—social media, etc.—and the intra- and interpersonal psychosocial factors behind a very complex and very broad phenomenon.

But in the midst of lots of dire warnings that we may never emerge intact from our current morass (a mindset I fall into myself on occasion), I found this layered study quite encouraging.

How does it strike you?

Annie

25 thoughts on “Misinformation Threatens Democracy; How Do We Counter It?”

  1. Annie, I just had a conversation with an aide to a GOP Congressman. After I shared my concerns over the ousting of Rep. Liz Cheney, saying what a shame it is as she is a truth teller, he closed with “thank you for sharing your difference on positions.” I corrected him and said, this is not a difference on positions – it is a difference between truth and lies. I happen to disagree with Cheney on a number of her positions, but on this issue, she is telling the truth and is being vilified for it. She is calling out the former president for his lies and seditious actions.

    To your point, we need to push back on people who are pushing misinformation, even when they do not know they are so doing. I think we can say things like:

    – I am sorry, but I am having a hard time believing that, what is your source?
    – I am sorry, but I do not put a lot of weight in the opinions of someone whose own employer called him an entertainer who should not be taken seriously.
    – I am sorry, but do you really believe that, or are you just saying it to get a reaction?
    – Or, I am sorry, but calling people names or labeling them does not help your argument.

    We must push back, of course, in a tactful way, but sometimes letting this lie serves little purpose. Keith

    1. Good for you, Keith, for not letting that Congressman’s aide get away with his denial of the truth. And I emphatically agree that we must respond, politely, when people spout misinformation.

      What the Republicans just did to Liz Cheney must be severely challenged. Her stand is the only thing she’s ever done that I find admirable, but she deserves support for her courage in defending democracy.

        1. Annie, thanks. The House leadership just gave her an even bigger megaphone. And one thing in her favor is that those same folks know one key thing – she is telling the truth. Keith

      2. They know, Keith, but they clearly don’t care. Kevin McCarthy said at the White House today that “no one is questioning the legitimacy” of the 2020 election. So why did they get rid of Cheney? It’s a wonder he and his cohort can stand upright—not a backbone among them.

  2. This phrase of yours sums up a lot of Trump, in my opinion: “the desire to attract and please followers/friends or to signal one’s group membership.” The only words missing there are “a lust for power.” As to Liz Cheney, shameful. Agree with Keith’s comment above and kudos to him for not letting that Congressman’s aide slither away. As to your question about whether I think adding an “accuracy” rating on social media posts will help. . . it’s not enough, but it’s a start. Thanks, as always, for your fabulous work! I rate it damn accurate, and thank you.

    1. Thank you, my friend—most appreciated.

      In the interests of accuracy, 🙂 I must point out that the quotation you cited was from the study’s authors, not me. But I concur with your points.

  3. Hi Annie, I must admit I find the research rather tangential to the real issue – which your correspondent Keith highlighted. It’s all about power and belonging. The Republicans believing in Trump’s lies are in a gang, and this gives them power. They are like school kids in the yard. So how might you change their behavior? Whatever we say, they are in their own bubble, strong in their mutual confirmation bias and smoking the same dope. What we need to find is a way of breaking up the gang. Better still, turning the gang on themselves. That’s our objective. Kind Regards, David

    1. Good to hear from you, David. As I mention in my last paragraph, these findings constitute one approach in combating a huge issue. I agree about power and belonging, but I think the study’s findings are another part of the equation—rather than tangential. The gang has tied itself to trump because they believe he’s the only way they can get voters to re-elect them. If there’s any way to increase these voters’ attention to accuracy, at least some of them may be peeled away. Then the gang may turn on each other. We’re going to see it soon when they go after Elise Stefanik, already anointed to replace Liz Cheney.
      Cheers,
      Annie

  4. A lot of interesting stuff here. The fact that people have a strong tendency to promote information even though they themselves doubt its accuracy is all too credible in light of what I think is the key point cited — “the desire to attract and please followers/friends or to signal one’s group membership” — especially the “group membership” part. In today’s tribalized culture, for many people, the desire to assert and demonstrate one’s identity trumps other considerations, even awareness of facts. This habit isn’t limited to right-wingers, though they seem far more prone to it these days.

    It’s encouraging that there’s data supporting the idea that simply asking people to rate the accuracy of claims can reduce the impulse to share inaccurate but reassuring stories. As the conclusion says, it would be quite easy for social media companies to apply this in their day-to-day operations, if they were motivated to do so. Unfortunately their current attitude seems to resemble that of so many oil companies — our job is to maximize profit and minimize costs, and whatever pollution gets spewed into the environment as a result, whether it’s oil spills or dangerous popular nonsense, isn’t our problem.

    Regulation would help, but right now the government has too much else on its plate to be likely to spend political capital on such an issue. The threat of regulation might work better. Companies have been known to clean up their act to some extent if they feared that failure to do so would eventually bring down the heavy hand of the state upon them.

    At any rate, it’s tentatively encouraging that many of the people spreading nonsense around the internet are, on some level, aware that a lot of it is nonsense. What we really need to do is figure out how to reverse this situation where everything gets cast as a marker of tribal identity, even things which are actually matters of fact or behavioral prudence. That’s the real poison in the system.

    1. Infidel: The Senate has periodically held hearings with major tech company representatives and outside critics of the industry’s practices in recent years—albeit at the subcommittee level to date. It does appear that the threat of regulation has had some impact.

      A recent hearing shed some light on the changes several social media giants have made in response to pressure, as well as the important changes they still have not made. If I can get through several transcripts any time soon, I hope to synthesize these developments.

  5. Really interesting as always, and one of the most insightful things I have read on this phenomenon. Far too few studies have focused on how people engage with news via social media. I thought the percentages for believing and sharing were profound. Maybe it reflects a culture and media platforms that prioritise being first to share over being accurate?

    1. Oh, I think that’s precisely what it reflects, Matthew. And we’re all paying a high price for that “business model.”

      A big question is where we are most likely to find remedy. It seems efforts must be made to change both the business model and the culture: a monumental challenge, but at least people are seeking answers.

    1. No, it’s not, Mitch—but we knew there was a problem with our “collective integrity,” didn’t we?

      At least this huge issue is being examined from multiple perspectives by people seeking to ameliorate the problem. I think that’s a healthy sign.

  6. Another intriguing post, Annie. I agree that studies about how people absorb news are needed. But what boggles my mind, especially regarding the Liz Cheney issue, is the extent to which today’s GOP is peddling outright lies. I’m fairly confident that neither the former president nor his minions believe he really won the election, but their ability to sell that falsehood to their minions—including the January 6 rioters—is downright scary. It’s hard to imagine that even the most systematic investigations will lift the wool from these folks’ eyes.

    1. Thank you, Gail.
      I do think the former guy is capable of total self-deception. Being a loser is too odious for him to contemplate.

      But there are so many who know better and don’t seem to care one whit about democracy, truth, and peace. I agree: that’s scary.

  7. That is indeed encouraging news, Annie!

    “platforms could periodically ask users to rate the accuracy of randomly selected headlines”

    Sounds well worth a try. People like short online quizzes, and I think the notion that most of us want to be accurate is right.

    1. Right, Carol. The question is whether the tech companies will be willing to incorporate this relatively simple approach. Since they are worried about regulation, perhaps they will be. Here’s hoping…

  8. Hi Annie, you may be interested in Paul Krugman’s post today.

    “…Today, of course, the Republican Party has turned into a Trump personality cult, even though Donald Trump’s actual accomplishments are hard to find. A failed nuclear deal with North Korea, a failed trade deal with China, a tax cut that never delivered the promised investment boom, a mishandled pandemic? Never mind. Obsequious professions of loyalty aren’t just expected, they’ve become a requirement for those who want to stay in the party. …”

    https://messaging-custom-newsletters.nytimes.com/template/oakv2?campaign_id=116&emc=edit_pk_20210518&instance_id=31052&nl=paul-krugman&productCode=PK&regi_id=77066008&segment_id=58365&te=1&uri=nyt%3A%2F%2Fnewsletter%2F0c5bcc4c-fb3b-5ef3-8df2-61a049ff61cb&user_id=200c468bcb1df9ed285f43809831dac0

    1. Thank you, David. I normally read Paul Krugman, but I missed this one. I always appreciate receiving links to relevant pieces.

      It’s amazing, isn’t it? How can people say things that they have to know are so manifestly untrue?
