False, fake and fibs: The science behind misinformation
Researchers from City and the University of Oxford demonstrate what social science can tell us about misinformation, and what we can do to minimise its impact on us.
Last year ‘fake news’ was the phrase on everybody’s lips, but just how susceptible are we to its influence, how does it spread, and what are the best ways of limiting its impact on us as individuals and as a society?
The ‘False, fake and fibs: The science behind misinformation’ event was held at City, University of London, as part of the Economic and Social Research Council’s (ESRC) Festival of Social Science 2018.
Three experts in the field shed light on this topical issue, by sharing their insight and scientific findings.
How do we process corrections to misinformation?
City’s Saoirse Connor Desai focused on how we process corrections to misinformation.
She used examples such as the 2016 ‘Brexit Bus’ advertisement which claimed that the UK sends £350 million a week to the European Union.
Despite the claim being criticised by the UK Statistics Authority, an Ipsos MORI poll taken just before the Brexit referendum suggested that half of those polled believed it to be true.
She also shared research suggesting that corrections of misinformation often fail because they leave a gap in the story originally set out by the misinformation. A more effective way of correcting misinformation is to share a plausible alternative narrative that fills that gap, so that no hole is left in people’s minds.
Is reasoning the enemy? The neuroscience behind rejecting unwanted information
Dr Andreas Kappes explained the neuroscience behind why people might reject unwanted information.
In his own research project, participants were shown details of a property and told its price. They were then asked to judge whether it was worth more or less than the listed price, and to bet fake money according to how confident they were in their judgement. They were then partnered with a second participant who was asked to do the same.
Dr Kappes found that if the second person’s judgement of the property’s value was the same as that of the first, the first participant would take their partner’s bet on board and settle on a compromise final wager. However, if the second person’s judgement differed, the first participant was likely to ignore the size of their partner’s bet altogether.
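The asymmetric updating pattern described above can be written out as a toy rule. This is a minimal sketch only, not Dr Kappes’s actual materials; in particular, the simple-average compromise is an assumption made for illustration.

```python
def final_wager(own_judgement: str, own_bet: float,
                partner_judgement: str, partner_bet: float) -> float:
    """Return the first participant's final wager.

    Judgements are 'more' or 'less' (worth more or less than the listed
    price). Agreement -> compromise (modelled here as a simple average,
    an assumption); disagreement -> the partner's bet is ignored.
    """
    if own_judgement == partner_judgement:
        # Wanted information: take the partner's confidence on board.
        return (own_bet + partner_bet) / 2
    # Unwanted information: the partner's bet carries no weight.
    return own_bet
```

For example, a participant betting 40 whose agreeing partner bets 60 would settle on 50, while the same participant facing a disagreeing partner would stay at 40 regardless of the partner’s stake.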
Dr Kappes also ran the study with participants in an fMRI brain scanner. It showed that when the partner agreed with the first participant’s judgement, part of the frontal cortex of their brain was activated. When the partner disagreed there was no such activation. This provides objective evidence of the brain behaving differently when presented with wanted versus unwanted information.
The audience were also invited to play a version of this game with each other at the event, using their mobile phones. The results echoed the findings of Dr Kappes’s research.
How can people get and maintain wrong beliefs?
Dr Jens Koed Madsen, University of Oxford, then discussed how the ‘cognitive models’ of thinking described by the speakers from City could be integrated into simulations of individuals interacting in social networks.
He shared how the interaction of people on social media can be considered a ‘complex’ system, meaning that although the individual interactions between people may be simple, it can be very difficult to understand how these combine into the wider behaviour of a population. He used the analogy of a peaceful demonstration turning into a riot: it may start off with a few demonstrators (and perhaps police) nudging and pushing each other, but can escalate over time into an extreme situation.
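One standard way of making this kind of tipping behaviour concrete (a classic illustration from the social-science literature, not something presented at the event) is a simple threshold rule: each person joins the disturbance once enough others already have, and the process is iterated until nobody new joins.

```python
def riot_size(thresholds):
    """Iterate a simple threshold rule: a person joins once the number
    already participating meets their personal threshold. Returns the
    final number of participants."""
    joined = 0
    changed = True
    while changed:
        new_joined = sum(1 for t in thresholds if t <= joined)
        changed = new_joined != joined
        joined = new_joined
    return joined
```

With thresholds 0, 1, 2, …, 9 the whole crowd of ten cascades into the riot, but changing a single threshold from 1 to 2 halts the cascade at one person, showing how a tiny individual-level difference can flip the population-level outcome.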
Dr Madsen suggested that computer simulations of social networks could help us understand how people acquire and maintain wrong beliefs. This includes how people sort themselves into ‘echo chambers’ of wrong beliefs based on misinformation, whether on social media or in person.
Findings from his research suggest that individuals who self-organise into echo chambers of extremist belief do not necessarily differ in their cognitive functions (thinking ability) from those who do not. The findings also suggest that when individuals can make more connections in their network, larger echo chambers emerge, despite greater access to rational individuals.
It is therefore important to understand the processes by which echo chambers develop, and not look solely at their outcomes.
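A toy agent-based simulation can illustrate the kind of mechanism Dr Madsen described. The sketch below is an illustration only, not his actual model: agents hold a binary belief and preferentially link to like-minded agents, and the size of the largest like-minded cluster serves as a crude measure of echo-chamber size. The homophily probabilities and link limits are assumptions.

```python
import random

def simulate(n_agents=100, max_links=5, steps=2000, seed=1):
    """Toy echo-chamber sketch: agents with spare link capacity meet at
    random and are far more likely to connect when they share a belief."""
    rng = random.Random(seed)
    belief = [rng.randint(0, 1) for _ in range(n_agents)]
    links = {i: set() for i in range(n_agents)}
    for _ in range(steps):
        a = rng.randrange(n_agents)
        b = rng.randrange(n_agents)
        if a == b or len(links[a]) >= max_links or len(links[b]) >= max_links:
            continue
        # Homophily: like-minded connections are accepted far more often.
        accept = 0.9 if belief[a] == belief[b] else 0.1
        if rng.random() < accept:
            links[a].add(b)
            links[b].add(a)
    return belief, links

def largest_echo_chamber(belief, links):
    """Size of the largest connected cluster of like-minded agents."""
    seen, best = set(), 0
    for start in links:
        if start in seen:
            continue
        stack, size = [start], 0
        seen.add(start)
        while stack:
            node = stack.pop()
            size += 1
            for nb in links[node]:
                # Only traverse edges between agents who agree.
                if nb not in seen and belief[nb] == belief[node]:
                    seen.add(nb)
                    stack.append(nb)
        best = max(best, size)
    return best
```

Varying `max_links` in a sketch like this is one way to explore how the number of possible connections relates to the size of the clusters that form.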
Reflections from the speakers
Dr Jens Koed Madsen appreciated the audience involvement and discussion topics. He said:
“The audience participated well and posed very relevant questions. For example, we had an interesting debate on the role of media, the perceived equal time for any issue, and various ideas for curtailing the spread of misinformation. In addition, we discussed the role of education in combatting the spread of misinformation, including teaching children to reason critically and the potentially devastating effect of the so-called ‘teach the controversy’ principle concerning the theory of evolution versus unscientific ideas such as creationism.”
Dr Andreas Kappes was delighted with the outcome of the event, and said:
“I was hoping that, beyond telling a broader audience about my research, I would be reminded of why the research we do is important, and how it connects to other areas of society. Having discussions after the event with people from journalism and education, among others, was brilliant and invigorating for me as a researcher, and I'm looking forward to the next one.”
Saoirse Connor Desai agreed:
“It was great to see so many people interested in our work and asking so many insightful questions; it really gave me some food for thought!”
More from the ESRC Festival of Social Science 2018
To find out more about the festival, including the 302 other events held, visit the ESRC website.