By Dr Marco Bastos, Department of Sociology and Criminology, on the current problems facing Facebook in Russia

By City Press Office, Published

It is unsurprising that polarising content on social media is thriving during the Russo-Ukrainian war. Attrition warfare is invariably accompanied by propaganda that seeks to appear as truthful as possible.

This battle for the hegemonic narrative was plainly evident when Russian officials decided to enforce the term "special military operation" to describe the Russian invasion of Ukraine.

The dispute is likely to become increasingly disjointed as Russians and Ukrainians push wartime narratives in which facts become contested, unverifiable, or broadly assumed to be illegitimate.

In this tinderbox of a situation, it is problematic that social platforms allow no external oversight of the algorithmic routines that select the content displayed to users.

These algorithms are optimised to prioritise content the company believes to be important to users, but the exact nature of this optimisation is obscured by the proprietary nature of Facebook's algorithms: both the data and the models underpinning them are shielded from scrutiny by policymakers and the research community.

Algorithm designs include recommendation systems based on collaborative filtering (social information filtering), content-based filtering, constraint-based recommendation, and critiquing-based recommendation. These recommendation algorithms are combined to produce highly personalised results for the individual user.
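To make the combination concrete, the following is a minimal, hypothetical sketch of how a collaborative-filtering signal and a content-based signal might be blended into one personalised ranking. The data, feature names, and weighting are invented for illustration; Facebook's actual recommendation models are proprietary and far more complex.

```python
# Hypothetical sketch: blending collaborative filtering (what similar users
# engaged with) and content-based filtering (what resembles the user's own
# history) into a single personalised ranking. Illustrative only.
import numpy as np

# Engagement matrix: rows are users, columns are posts (1 = engaged, 0 = not).
engagement = np.array([
    [1, 0, 1, 0, 1],
    [1, 1, 0, 0, 0],
    [0, 0, 1, 1, 1],
], dtype=float)

# Content features for each post (e.g. topic weights: politics, lifestyle, sport).
post_features = np.array([
    [0.9, 0.1, 0.0],
    [0.2, 0.7, 0.1],
    [0.8, 0.1, 0.1],
    [0.0, 0.3, 0.7],
    [0.7, 0.2, 0.1],
])

def cosine(a, b):
    """Cosine similarity between two vectors (0 if either is all zeros)."""
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom else 0.0

def collaborative_scores(user_idx):
    """Score posts by what similar users engaged with (social information filtering)."""
    target = engagement[user_idx]
    sims = np.array([cosine(target, other) for other in engagement])
    sims[user_idx] = 0.0                     # ignore the user's own row
    return sims @ engagement                 # similarity-weighted engagement

def content_scores(user_idx):
    """Score posts by similarity to the user's own engagement history (content-based)."""
    profile = engagement[user_idx] @ post_features   # aggregate features of engaged posts
    return np.array([cosine(profile, f) for f in post_features])

def hybrid_ranking(user_idx, weight=0.5):
    """Blend the two signals and rank posts, highest predicted interest first."""
    scores = weight * collaborative_scores(user_idx) + (1 - weight) * content_scores(user_idx)
    return np.argsort(-scores)

print(hybrid_ranking(user_idx=0))   # post indices ordered for user 0
```

Even in this toy form, the output is personalised: two users with different engagement histories receive differently ordered feeds from the same pool of posts.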

Social media propaganda has, of course, picked up on these cues and tailored influence operations so that they employ the relatable and familiar faces of ordinary people. Sophisticated influence operations on social media are virtually indistinguishable from organic content boosted by social media algorithms.

It is also problematic that Facebook's algorithms follow internal heuristics whose objectives stem from the priorities of the company rather than the strictures of journalistic practice. The process entails distilling hundreds of candidate pieces of content down to those the model predicts the user will engage with most.

Unfortunately, content that users are more likely to engage with is not content that is more likely to be accurate. If anything, it is the subset of content most likely to trigger an emotional response.
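The two points above can be condensed into a short sketch of the distillation step: a large pool of candidates is scored by a model that predicts engagement, and only the top handful is shown. The scoring and the data here are invented for illustration and are not Facebook's model; the point is simply that accuracy never enters the ranking objective.

```python
# Hypothetical sketch of engagement-optimised feed ranking. The is_accurate
# flag is deliberately ignored by the sort key, mirroring the concern that
# engagement, not accuracy, drives what is shown.
import random
from dataclasses import dataclass

@dataclass
class Post:
    post_id: int
    predicted_engagement: float   # model's estimate that the user will react, comment, or share
    is_accurate: bool             # whether the content is factually accurate (unused by ranking)

def rank_feed(candidates: list[Post], feed_size: int = 10) -> list[Post]:
    """Keep only the posts with the highest predicted engagement."""
    return sorted(candidates, key=lambda p: p.predicted_engagement, reverse=True)[:feed_size]

# Simulate a few hundred candidate posts with random scores.
random.seed(0)
pool = [Post(i, random.random(), random.random() > 0.4) for i in range(500)]
feed = rank_feed(pool)
print([p.post_id for p in feed])
```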

There is also mounting evidence that Facebook's Feed has an outsized impact on hard news content, precisely the subset of content that can help to bridge the gap between propaganda and factual reality. News and politics are more likely to be affected by algorithmic changes than soft news items such as lifestyle, sports, and arts.

The asymmetric power exerted by social platforms on news organisations in the past decade is another force driving down the integrity of the information ecosystem, particularly with regard to how trusted information and news are obtained and consumed online.

This divide, unfortunately, is only likely to worsen in a context of attrition warfare.


Read more

‘Fact-Checking Misinformation: Eight Notes on Consensus Reality’, in Journalism Studies.

‘Guy next door and implausibly attractive young women: The visual frames of social media propaganda’, in New Media & Society.


About Dr Marco Bastos

Dr Marco Bastos is Senior Lecturer in the Department of Sociology and Criminology at City, University of London where he teaches media and communication theory and research methods. Before that, he held research positions at the University of Sao Paulo, University of California at Davis, and Duke University, where he is an affiliate of the Duke Network Analysis Center.

In 2017, Dr Bastos’ research showed evidence of networks of thousands of suspect Twitter bots working to influence the Brexit debate in the run-up to the EU referendum. The research became a major international news story, initially published by Buzzfeed.

His work addresses sociological aspects of digital media with a substantive interest in the cross-effects between online and offline social networks. His research brings together communication and computational social science and the source code of his projects is available on CRAN and GitHub.