Research reveals Facebook ‘echo chambers’

Heard it all before: The way we use social media may be narrowing our minds.

A Facebook study suggests that users are disproportionately accessing opinions which reinforce what they already believe. Are social media narrowing our minds?

‘Who on earth votes UKIP?’

‘Who voted Conservative — how did they get so many votes here?’

‘Urgh. Who votes for more austerity, more food banks, more wealth for the 1%, fewer public services and the death of the NHS?’

As the election results broke on Thursday night, some social media users with left-wing political views were left dumbfounded. Who were the millions of voters who had disagreed with them? How could the Tories win and UKIP gain nearly four million votes? It seemed incomprehensible.

Some of their bewilderment may be the result of the ‘echo chamber’ effect of social media: Facebook and Twitter feed their users a high proportion of articles and posts which reinforce their existing views and prejudices. Research by Facebook confirms what many have suspected — and suggests that our own choices do more to create the phenomenon than the website’s algorithms.

The study suggests that our choice of friends is by far the most important reason why we are exposed to a limited range of opinions: the average Facebook user who declares a political allegiance shares that position with 80% of the friends who also declare one. Facebook’s algorithms then reduce the amount of ‘cross-cutting’ content — material from points of view different to our own — slightly further. From the articles that remain, people then select a high proportion of those they are inclined to agree with.
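As a rough illustration of how these three stages compound, here is a minimal Python sketch. The proportions are hypothetical placeholders, not figures from the study.

```python
# A sketch of the three-stage filtering described above.
# All proportions are hypothetical, chosen only to illustrate the idea.

def remaining_cross_cutting(friend_share, algorithm_cut, click_cut):
    """Fraction of cross-cutting content a user ultimately reads,
    after each stage removes some of it."""
    exposure = friend_share           # stage 1: what our friends share
    exposure *= (1 - algorithm_cut)   # stage 2: the ranking algorithm trims it
    exposure *= (1 - click_cut)       # stage 3: we click selectively
    return exposure

# Hypothetical example: friends supply 20% cross-cutting content,
# the algorithm trims 10% of that, and the user skips half of the rest.
print(remaining_cross_cutting(0.20, 0.10, 0.50))  # -> 0.09
```

Even modest cuts at each stage, multiplied together, leave a user reading far less opposing opinion than their network actually contains.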

So social media may have the potential to hinder democratic debate and encourage conflict. When we only discuss our views with people who agree with us, this is believed to make us more certain of ourselves and to push us towards extreme positions. Although social media may appear to provide a forum where we can share a variety of points of view, in reality they may simply be somewhere for opposing groups to reassure themselves that they are right while ignoring each other.

Face off

This trend has been seen as a damning indictment of social media. We are pretending to ourselves that Facebook and other online platforms are liberating us and allowing us to broaden our minds, when in reality they are achieving the opposite. As a result, the world is a poorer place with less chance for reasonable discussion to take place.

But perhaps that is abdicating responsibility to the technology. This study shows that our choices are the real problem, not the social media themselves. If we choose to join Facebook, choose to make friends with those like us and then choose to read things we already agree with, we only have ourselves to blame if we end up narrow-minded and arrogant in our opinions.

You Decide

  1. Do social media sites broaden or narrow minds?
  2. Do you learn more by reading the news on Facebook or Twitter, or on more traditional media?

Activities

  1. Think of an issue which you feel passionately about (such as the election). Then write a letter to yourself from someone who strongly disagrees with you on the subject, explaining their point of view.
  2. Create, or design on paper, a web page on a subject of interest to you. Provide links to articles reflecting a diverse range of opinions on the subject and explain why you chose them.

Some People Say...

“Individuals must assume responsibility for the programming media they consume.”

Art Silverblatt

What do you think?

Q & A

Is this just a problem on social media?
No, because published material can never be truly objective or cover every possible point of view on an issue. Newspapers, for example, tend to take an editorial line (some more subtly than others), meaning that they disagree with each other over the relative importance of stories — and this line is most heavily influenced by what their readers want to read. This problem has always existed, but social media may be fooling people into thinking that it no longer does, when in reality it is getting worse.
How reliable is this research?
The finding that we seek opinions similar to our own is backed up by academics. But the reasons for it will require further investigation, particularly as this research was conducted by Facebook itself.

Word Watch

The website’s algorithms
Facebook uses a complex mathematical formula to filter the content its users see. The company is secretive about how the algorithm works, but argues that it is essential to give users the best possible experience of the website without overwhelming them with material they do not care about. It also helps to ensure that advertisers, whose money funds the site, target the people most likely to respond to them.
‘Cross-cutting’ content
Facebook used people who identified their political views as ‘liberal’ or ‘conservative’ in the study. ‘Cross-cutting’ content was anything considered to be from the opposite side of the political spectrum.
Take extreme positions
Harvard law professor Cass Sunstein says that, when we consistently deliberate with those who agree with us, it tends to move us ‘toward a more extreme point in the direction indicated by our own pre-deliberation judgments.’