“The election was stolen.” “There is no pandemic.” “Vaccines against Covid-19 are killing thousands of people.” “Humans have nothing to do with climate change.” And on and on.
The US has long had naysayers with their own views of reality. But through a confluence of factors, including the enormous impact of social media, we are now confronting a problem whose ramifications for our democracy are vast.
The problem seems overwhelmingly diffuse and intractable. But is it? Some researchers offer rays of hope. Because we need those rays, I willingly immersed myself in a journal article brimming with P values and data that yielded some positive findings.
(A P value is the probability of obtaining a result at least as extreme as the one observed if chance alone were at work. Thus, a lower P value indicates stronger evidence that a difference is real than a higher one would. This is not my most comfortable milieu…)
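To make that parenthetical concrete, here is a minimal sketch of how a P value can be estimated by simulation, using made-up ratings for two hypothetical groups (not the study's data): shuffle the group labels many times and count how often chance alone produces a difference at least as large as the one observed.

```python
import random

random.seed(0)  # fixed seed so the simulation is reproducible

# Hypothetical data: 1 = rated a headline "accurate", 0 = did not.
group_a = [1, 1, 1, 0, 1, 1, 0, 1, 1, 1]  # e.g. ratings of true headlines
group_b = [1, 0, 0, 1, 0, 0, 1, 0, 0, 0]  # e.g. ratings of false headlines

observed_diff = sum(group_a) / len(group_a) - sum(group_b) / len(group_b)

# Permutation test: pool all ratings, reshuffle them into two random
# groups many times, and see how often random labeling alone yields a
# difference at least as large as the observed one.
pooled = group_a + group_b
n = len(group_a)
trials = 10_000
extreme = 0
for _ in range(trials):
    random.shuffle(pooled)
    diff = sum(pooled[:n]) / n - sum(pooled[n:]) / (len(pooled) - n)
    if diff >= observed_diff:
        extreme += 1

p_value = extreme / trials
print(f"observed difference: {observed_diff:.2f}, p is about {p_value:.4f}")
```

With these invented numbers the shuffled groups rarely reproduce the observed gap, so the P value comes out small; the rarer chance alone can match the result, the lower the P value.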
The article, in Nature, is titled “Shifting attention to accuracy can reduce misinformation online.” Its authors’ varied expertise encompasses business, media, science, information technology, and psychology.
Through seven studies, they tested a series of competing opinions about why people shared misinformation online—and how this tendency might be changed.
The first few involved whether respondents’ actions were the result of confusion—they mistakenly believed the misinformation was accurate—or a preference for partisanship over accuracy(!), or inattention to accuracy. (Emphases mine throughout.)
They provided two groups with headlines, lede sentences, and images from actual social media stories. Half of the samples were true and half were false; half were designed to appeal to self-identified Democrats and half to Republicans.
The groups, selected randomly, were asked either whether the items were true (accurate) or whether they’d think of sharing them with others.
The true headlines were rated accurate significantly more often than were the false ones. The same held for what they called “politically concordant” items—those aligned with the respondent’s own party—but by a far smaller margin.
In the group that was asked if they’d share the stories, there was a reversal: respondents were more likely to share a politically concordant headline—whether or not it was accurate.
The example the authors included was a headline: “Over 500 ‘Migrant Caravaners’ Arrested With Suicide Vests.” Although only 15.7% of Republicans rated it as accurate, 51.1% said they would consider sharing it.
Even so, at the end of the study, the preponderance of respondents said they rated accuracy “extremely important.”
The authors concluded that the selections were influenced less by an individual’s political preference and more by “inattention.”
The social media milieu moves attention more toward “other factors, such as the desire to attract and please followers/friends or to signal one’s group membership.”
This pattern, the authors write, parallels a subsequent field study they conducted with actual Twitter users, who showed a similar willingness to share items that accorded with their political views regardless of veracity.
Those users were people who had linked to two right-wing sites that fact checkers regard as “highly untrustworthy”: Breitbart and Infowars. The researchers sent them a message seeking their opinions on one non-political headline.
The result was improved quality in the news items they later shared. It almost sounds too good to be true, doesn’t it? Of course, we don’t have information about how long this change lasted.
Similarly, in the authors’ other studies, a “treatment” group that was subtly encouraged to consider the accuracy of certain news items went on to share items twice as accurate as those shared by a “control” group that received no such encouragement.
Even more encouraging, “the treatment effect was actually significantly larger for politically concordant headlines than for politically discordant headlines.” And more encouraging still, the treatment effect held significantly for both self-identified Democrats and self-identified Republicans.
“Therefore, shifting attention to the concept of accuracy can cause people to improve the quality of the news that they share.”
The authors conjecture that people’s willingness to share material they do not firmly believe suggests that partisan fervor may be less intense than one might assume. They wisely stress the need for further research to determine people’s “state of belief when not reflecting on accuracy.”
They have already replicated these study findings with Covid-19 headlines and found comparable results. They recommend further work involving subjects such as organized disinformation about climate change and fraud in the 2020 US election. What an excellent idea!
Here’s their conclusion:
“Our results suggest that the current design of social media platforms—in which users scroll quickly through a mixture of serious news and emotionally engaging content, and receive instantaneous quantified social feedback on their sharing—may discourage people from reflecting on accuracy.
“But this need not be the case. Our treatment translates easily into interventions that social media platforms could use to increase users’ focus on accuracy.
“For example, platforms could periodically ask users to rate the accuracy of randomly selected headlines, thus reminding them about accuracy in a subtle way that should avoid reactance… (and simultaneously generating useful crowd ratings that can help to identify misinformation…).
“Such an approach could potentially increase the quality of news circulating online without relying on a centralized institution to certify truth and censor falsehood.”
This is, of course, a single effort in a burgeoning field of study that is looking at both the external factors—social media, etc.—and the intra- and interpersonal psychosocial factors behind a very complex and very broad phenomenon.
But in the midst of lots of dire warnings that we may never emerge intact from our current morass (a mindset I fall into myself on occasion), I found this layered study quite encouraging.
How does it strike you?