Tuesday, January 12, 2016

How Facebook makes us dumber

Why does misinformation spread so quickly on social media? Why does it not get corrected? When the truth is so easy to find, why do people accept falsehoods?

A new study focusing on Facebook users provides strong evidence that the explanation is confirmation bias: People’s tendency to seek information that confirms their beliefs and to ignore contrary information.

Confirmation bias turns out to play a pivotal role in the creation of online echo chambers. This finding bears on a wide range of issues, including the current presidential campaign, the acceptance of conspiracy theories and competing positions in international disputes.

The new study, led by Michela Del Vicario of Italy’s Laboratory of Computational Social Science, explores the behaviour of Facebook users from 2010 to 2014. One of its goals was to test a question that remains sharply disputed: when people are online, do they encounter opposing views, or do they create the virtual equivalent of gated communities?

Del Vicario and her coauthors explored how Facebook users spread conspiracy theories (using 32 public web pages), science news (using 35 such pages) and “troll” content that intentionally spreads false information (using two web pages).

In sum, the researchers found many communities of like-minded people. Conspiracy theories, even baseless ones, spread rapidly within such communities. In general, Facebook users tended to choose and share stories containing messages they accepted, and to neglect those they rejected. If a story fits what people already believe, they are far more likely to be interested in it, and thus to spread it.

On Facebook, the result is the formation of a lot of “homogeneous, polarised clusters”.

Within those clusters, new information moves quickly among friends — often in just a few hours.

The consequence is the “proliferation of biased narratives fomented by unsubstantiated rumours, mistrust, and paranoia”. And while the study focuses on Facebook users, there is little doubt that something similar happens on other social media, such as Twitter — and in the real world as well.

GROUP POLARISATION

Striking though their findings are, Del Vicario and her coauthors do not mention the important phenomenon of “group polarisation”: when like-minded people speak with one another, they tend to end up believing a more extreme version of what they originally thought. Whenever people spread misinformation within homogeneous clusters, they also intensify one another’s commitment to that misinformation.

Of the various explanations for group polarisation, the most relevant involves a potentially insidious effect of confirmation itself. Once people discover that others agree with them, they become more confident — and then more extreme.

In that sense, confirmation bias is self-reinforcing, producing a vicious spiral. If people begin with a certain belief and find information that confirms it, they will intensify their commitment to that very belief, thus strengthening their bias.

Suppose, for example, that you think an increase in the minimum wage is a sensational idea, that the nuclear deal with Iran is a mistake, that Obamacare is working well, that Mr Donald Trump would be a fine President, or that the problem of climate change is greatly overstated.

Arriving at these judgments on your own, you might well hold them tentatively and with a fair degree of humility. But after you learn that a lot of people agree with you, you are likely to end up with much greater certainty — and perhaps real disdain for people who do not see things as you do.

On the basis of all the clustering, that almost certainly happened on Facebook. Strong support for this conclusion comes from research from the same academic team, which finds that on Facebook, efforts to debunk false beliefs are typically ignored — and when people pay attention to them, they often strengthen their commitment to the debunked beliefs.

Can anything be done? The best solution is to promote a culture of humility and openness. Some people, and some communities, hold their views tentatively; they are interested in refutation, not just confirmation. Moreover, companies that manage online platforms, such as Google, can take steps to help people assess the trustworthiness of what they see, though such efforts remain preliminary and might prove controversial.

In the midst of World War II, the great federal judge Learned Hand said that the spirit of liberty is “that spirit which is not too sure that it is right”. Users of social media are certainly exercising their liberty.

But there is a real risk that when they fall prey to confirmation bias, they end up compromising liberty’s spirit — and dead wrong to boot.

BLOOMBERG

ABOUT THE AUTHOR:

Cass R. Sunstein, the former administrator of the White House Office of Information and Regulatory Affairs, is the Robert Walmsley University Professor at Harvard Law School and a Bloomberg View columnist.