News from City, University of London
Science & Technology Series: Announcements

Tackling online extremism

City’s Professor of Cyber Security, Professor Tom Chen, is helping to combat the challenges posed by far-right extremism and separatism on the Internet through the Raven academic startup.
by John Stevenson (Senior Communications Officer)

Through the Innovate UK startup accelerator programme, City’s Professor Tom Chen is developing Raven, an intelligent web crawling system for finding, identifying, and analysing extremist videos using advanced machine learning techniques.
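The article does not publish Raven's implementation, but the idea of a crawler that harvests video links for downstream machine-learning classification can be sketched in outline. The code below is a minimal, hypothetical illustration: the class names, the video-extension pattern, and the stubbed `classify` function are all assumptions, not Raven's actual design.

```python
import re
from html.parser import HTMLParser


class VideoLinkExtractor(HTMLParser):
    """Collect href/src attribute values that look like video resources."""

    VIDEO_PATTERN = re.compile(r"\.(mp4|webm|m3u8)(\?|$)")

    def __init__(self):
        super().__init__()
        self.video_urls = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("href", "src") and value and self.VIDEO_PATTERN.search(value):
                self.video_urls.append(value)


def extract_video_urls(html: str) -> list:
    """Parse a crawled page and return candidate video URLs."""
    parser = VideoLinkExtractor()
    parser.feed(html)
    return parser.video_urls


def classify(url: str) -> float:
    """Placeholder for a trained model returning a risk score in [0, 1].

    A real system would download the video and run a classifier over
    its frames and audio; here the model is stubbed out entirely.
    """
    return 0.0
```

In a full pipeline, each extracted URL would be fetched, scored by the trained model, and the score used to decide whether the content is flagged to platform operators or law enforcement.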

Extremists are taking advantage of social platforms for propaganda, recruitment, and radicalisation.

Raven will assist Internet companies and law enforcement to find and take down extremist multimedia such as videos and images.

Under pressure in the aftermath of the New Zealand massacre, Facebook has pledged to block the "praise, support and representation of White nationalism and separatism" on Facebook and Instagram.

Enormous scale

While Facebook’s action has been welcomed, Professor Chen says significant challenges remain in detecting extremist propaganda, including far-right and white nationalist material:

“First, the scale of the problem is enormous. The social networks get far more content uploaded than they can self-police. The largest social media platforms such as Facebook and YouTube employ tens of thousands of human moderators to look for content that violates their terms of service. They also utilise automated AI methods to handle some of the processing.”

Professor Chen also points to the challenging time pressure on moderators:

“It takes time for human reviewers to examine and classify propaganda. Automated AI methods can work faster but will always have a chance of misclassification. There will always be borderline cases that require human judgement. Added to this is the issue of freedom of speech. The social media companies do not want to be in the business of censoring their content. It is sometimes a tricky judgement call whether content should be taken down or protected by freedom of speech.”
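The trade-off Professor Chen describes, where automated methods handle confident cases quickly but borderline cases go to human reviewers, is commonly implemented with score thresholds. The following is a minimal sketch of that triage pattern; the threshold values and function name are illustrative assumptions, not a description of any platform's actual policy.

```python
def triage(score: float, auto_remove: float = 0.95, auto_keep: float = 0.10) -> str:
    """Route a model's risk score for a piece of content.

    High-confidence cases are handled automatically in either direction;
    anything in between is queued for human moderator review.
    The thresholds here are arbitrary placeholders.
    """
    if score >= auto_remove:
        return "remove"
    if score <= auto_keep:
        return "keep"
    return "human_review"
```

Raising `auto_remove` reduces wrongful takedowns (protecting legitimate speech) at the cost of sending more items to the human queue, which is exactly the scale-versus-accuracy tension the article describes.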

Might a more joined-up approach between national, EU and international law enforcement agencies and social media platforms be more effective?

According to Professor Chen, these platforms already partner with law enforcement agencies, and take referrals from the public to take down offensive content, but the enormous volume of uploaded content stubbornly remains the main problem for all parties:

“The trend will be to rely more on automated artificial intelligence (AI) methods, out of necessity, though there are ongoing debates about the risks of possible over-dependence on AI”.
