Users manipulated in worrying Facebook study
Facebook has published details of a giant experiment it conducted in 2012, in which it secretly manipulated the emotions of 700,000 users. Should we be concerned by social media’s power?
It knows your school, your taste in films and music, where you have been on holiday and for a while it could even recognise your face in photos. But a controversial study shows Facebook’s potential power is much greater. It can be made not just to reflect a user’s mood, but to control it.
The social media giant has just released details of a vast study it conducted in 2012 which manipulated the emotions of almost 700,000 users who had no idea that they were part of the experiment.
In one test, Facebook filtered users’ home pages so that they saw only negative content from friends, such as bad news, complaints and angry rants. After a week, these people themselves began to post more negative comments. In another test, users received only happy news and comments, and after a week their posts became more positive.
The study concluded that social media can produce what it calls ‘massive-scale emotional contagion.’ By filtering what we see, social media can alter our attitudes and even our moods.
Lawyers and activists are outraged that Facebook carried out such an experiment without the consent of its users. The manipulation has been called ‘scandalous’, ‘spooky’ and ‘disturbing’, and a senior British MP has even called for a government investigation. He worries that if Facebook can be used to tamper with people’s feeds, perhaps it could be abused for political propaganda, stifling one side of a debate while promoting the other.
But others point out that search engines like Google already customise users’ search results based on their search history. With so much data online, programmers have to make big decisions about what reaches us. Even some stories that make it into the news are chosen for their likely popularity rather than their importance. Given that every aspect of our media experience is in some way filtered, we should not be surprised that the same applies to Facebook.
Some are alarmed by Facebook’s study. With over a billion users, the site could have a huge influence, perhaps by making people angry and inciting rebellion in fragile parts of the world. More needs to be done to regulate its potential power. At the very least users need to be aware of how easily their emotions can be manipulated.
Yet others believe that while it was unethical for Facebook to experiment on its users without their consent, people join of their own free will and do not pay to use the site. They are deluded if they think social media is ever an unfiltered reflection of reality. These tests should be a wake-up call. What is really worrying is that so many people trusted the site so unreservedly in the first place.
- Was Facebook right to experiment secretly on its users?
- Should we be worried about Facebook’s potential power to control our emotions?
- In groups, pretend you run a social media site like Facebook or Twitter. A person could receive potentially hundreds of updates from friends, but your website can only show them ten. How do you decide what is displayed? Make a list of rules.
- Creative writing: Imagine all social media is controlled to promote a sinister political message. Write a diary entry for someone who is watching people whose views are being influenced and changed by social media.
Some People Say...
“If you are not paying for something on the internet, then you are the product.”
What do you think?
Q & A
- Is Facebook controlling what I see?
- To an extent, yes. With so many potential posts and advertisements, social media companies use algorithms to determine what appears on a user’s wall and how often. If you visit a friend’s page regularly, for instance, news of that friend will appear more frequently. But other than in the experiment, the site does not try to influence users’ emotions.
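The idea behind this kind of filtering can be sketched in a few lines of code. The example below is purely illustrative and is not Facebook’s actual algorithm: it simply scores each post by how often the viewer has interacted with its author, then shows only the top few.

```python
# Hypothetical illustration of feed filtering -- NOT Facebook's real
# algorithm. Posts are ranked by how often the viewer has visited each
# friend's page, and only the highest-scoring posts are shown.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str

def rank_feed(posts, visits, limit=10):
    """Return up to `limit` posts, most-visited friends first."""
    return sorted(posts,
                  key=lambda p: visits.get(p.author, 0),
                  reverse=True)[:limit]

posts = [Post("Alice", "Holiday photos"),
         Post("Bob", "Lost my keys"),
         Post("Carol", "New job!")]
visits = {"Alice": 12, "Bob": 2, "Carol": 7}  # page visits per friend

for post in rank_feed(posts, visits, limit=2):
    print(post.author)  # prints Alice, then Carol; Bob is filtered out
```

Even this toy version shows the editorial power involved: whoever sets the scoring rule decides which friends’ news a user ever sees.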
- What other scandals has Facebook been involved in?
- There have been many. Bullying has been a big issue: in one survey of children who had been bullied online, 87% said it happened on Facebook. Then there are privacy issues; even if you leave Facebook, your data remains on the system. A 2012 study also found that Facebook can be more addictive than tobacco and alcohol.
- In 2010 Facebook started using facial recognition technology for photo tagging. It caused a backlash from privacy groups and it is no longer in use in EU countries.
- It is difficult to know just how large the internet is, but one estimate suggests there are over 150m websites online. With all this data constantly being updated, social media sites and search engines have to be highly selective in what they show users.
- Important stories such as the Syrian refugee crisis and the hundreds of kidnapped schoolgirls in Nigeria are still ongoing, but without any recent major updates most news agencies tend to focus on other matters.
- The Turkish government banned Facebook and Twitter because they were being used to organise protests. In China, Facebook is blocked by a nationwide firewall so that anti-government views cannot be expressed.