Is It Possible to Escape Online Polarisation and Echo Chambers?

Kellogg professor of management and organisations Brian Uzzi, along with a team of colleagues in Italy, recently set out to explore human behaviour within politically polarised online environments. They found that users rarely stay neutral: the vast majority are drawn to polarised content remarkably quickly.

The research was aimed at taking a first step towards understanding, and ultimately dismantling, the echo-chamber effect.

“Social media has a tremendous effect on the minds of the masses and a tremendous responsibility to help educate us,” the Kellogg Insight blog quotes him as saying.

“Social media makes fake news real news—and has squarely helped to bring us into an era of truthlessness and ‘alternative facts’,” he adds.

Meanwhile, the algorithms social-media sites use to deliver personalised content also play a role, ensuring that users see only information that agrees with their beliefs. But how much does a user’s own behaviour contribute to the formation of online echo chambers?

The researchers analysed a massive database covering 12 million users of Facebook and YouTube. Prof. Uzzi’s team comprised Alessandro Bessi, Fabiana Zollo, Michela Del Vicario, Michelangelo Puliga, Antonio Scala, Guido Caldarelli, and Walter Quattrociocchi of the IMT School for Advanced Studies, Lucca, Italy.

Specifically, the research team looked at the “likes,” shares, and comments garnered by particular videos that were hosted on YouTube but also embedded on 413 different Facebook pages.

The team divided videos into two categories. Those appearing on Facebook pages like Illuminati Mind Control, Exposing Satanic World Government, and Doctors Are Dangerous, pages that served up controversial information without supporting evidence, were put into a “Conspiracy” category.

Videos that appeared on pages like Scientific American Magazine, the National Science Foundation, or NASA, pages that promoted scientific knowledge and rational thinking, were put into a “Science” category.

Next, they compared user consumption patterns of these videos on both social media platforms. The posts and their associated user interactions from January 2010 to December 2014 were examined to assess how a particular user’s interactions with the two types of content changed over time.

The findings revealed that on both platforms, some users interacted only with Conspiracy content and some only with Science content. Other users who initially interacted with both Conspiracy and Science content rapidly switched to interacting with only one or the other.

Regardless of how they started, nearly all of the users became highly polarised. The researchers defined this as happening when more than 95% of a user’s comments, shares, and “likes” were for a single category of content. In fact, 93.6% of Facebook users and 87.8% of YouTube viewers fell into this category.
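
To make the 95% rule concrete, the sketch below shows one minimal way to label a user as polarised. It is an illustration only, not the team’s actual code; the interaction labels, function name, and threshold constant are hypothetical.

    # Hypothetical sketch of the ">95% in one category" polarisation rule.
    # Each user is a list of interaction labels ("science" or "conspiracy"),
    # one entry per comment, share, or "like".
    from collections import Counter

    POLARISATION_THRESHOLD = 0.95  # assumed from the >95% rule in the text

    def polarisation_label(interactions):
        """Return the dominant category if its share exceeds the threshold."""
        if not interactions:
            return None
        category, count = Counter(interactions).most_common(1)[0]
        if count / len(interactions) > POLARISATION_THRESHOLD:
            return category
        return None  # user is not (yet) polarised

    # Example: 97 conspiracy interactions out of 100 -> labelled "conspiracy"
    user = ["conspiracy"] * 97 + ["science"] * 3
    print(polarisation_label(user))  # -> conspiracy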

Prof. Uzzi says it was the speed with which users became polarised that was most astonishing. For the users who started out interacting with both Conspiracy and Science videos, “we thought they would continue to look at a variety of content. But it turned out they quickly became very selective.” This change took place roughly by the time they made their 50th comment, share, or “like,” he adds.
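
The same idea can be extended to estimate how quickly the shift happens: scan a user’s interactions in order and record the first point at which one category’s running share crosses the threshold. The sketch below is again hypothetical, not the study’s methodology; the minimum-interaction cutoff is an assumption to avoid labelling users on their first few clicks.

    # Hypothetical sketch: find the interaction index at which a user's
    # running share of one category first exceeds 95%.
    from collections import Counter

    def first_polarisation_point(interactions, threshold=0.95, min_count=20):
        counts = Counter()
        for i, category in enumerate(interactions, start=1):
            counts[category] += 1
            top = counts.most_common(1)[0][1]  # count of the dominant category
            if i >= min_count and top / i > threshold:
                return i  # polarised by the i-th comment, share, or "like"
        return None  # never crossed the threshold

    # Example: a user samples both categories, then locks onto one
    history = ["science", "conspiracy"] * 2 + ["conspiracy"] * 60
    print(first_polarisation_point(history))  # -> 41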

“Even people who start out holding two points of view at the same time can very quickly go into an echo chamber,” he says.

In effect, changes in the media content people consume could quickly affect their beliefs. Facebook, YouTube and other media are constantly adjusting what users see based on what they have clicked on in the past. “Adjusting content in a particular way can lead people to make flash judgements and then hold onto those judgements, even when there is alternative information out there,” Prof. Uzzi says.
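
One way to see why click-based personalisation can narrow exposure so quickly is a toy “rich-get-richer” simulation: each time a category is clicked, its weight grows, so early choices compound. This is a deliberately simplified model, not how any platform’s ranking actually works, and every name and parameter in it is made up.

    # Toy feedback-loop model (hypothetical): each pick multiplies that
    # category's weight, so the feed drifts towards a single category.
    import random
    from collections import Counter

    def simulate_feed(steps=200, boost=1.2, seed=0):
        rng = random.Random(seed)
        weights = {"science": 1.0, "conspiracy": 1.0}  # start balanced
        shown = Counter()
        for _ in range(steps):
            # pick a category in proportion to its current weight
            pick = rng.choices(list(weights), weights=list(weights.values()))[0]
            shown[pick] += 1
            weights[pick] *= boost  # the platform boosts what was clicked
        return shown

    print(simulate_feed())  # typically ends heavily skewed to one category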

Once in an echo chamber, users rarely come out; if anything, they become even more polarised. “Inside an echo chamber, the thing that makes people’s thinking evolve is the even more extreme point of view. So you become even more left-wing or even more right-wing, because the loudest voice is the thing that drives people in a certain direction,” he adds.

While debunking false information may seem to be the solution, polarised social-media users do not seem interested in engaging with content that might contradict their beliefs. Such posts, in fact, make them cling to their positions even more fiercely.

An earlier study by some of the same Italian researchers examined the commenting behaviour of more than 9 million Facebook users who had become polarised towards conspiracy-type thinking, and found that only about 100,000 of them had commented on a debunking post. Those who did interact with the debunking content tended to react by upping their activity on conspiracy-related Facebook pages.

Attempts to help social-media users escape their echo chambers have been launched recently. These include FlipFeed, a plug-in that allows users to replace their own Twitter feeds with those of random, ideologically different strangers, and Escape Your Bubble, a plug-in that inserts opposing political views into users’ Facebook newsfeeds.

Ironically, when the Internet was first taking off, many observers and scientists thought that access to all this new information would make everybody smarter. “The idea was that since people wouldn’t have to go to a library and dig through old magazines to get information, they would explore the validity of other people’s points of view. They would do fact-checking, they would listen to both sides of the argument, and so they would get to the truth about the facts more quickly,” Prof. Uzzi says. However, that has not happened so far.