How to prevent another mass hate killing like Christchurch

Far-right hate speech is much more common online than Islamist extremism, but governments and law enforcement have not policed it to anywhere near the same extent, say Sasha Havlicek and Zahed Amanullah

New Zealand's prime minister, Jacinda Ardern, has ordered an inquiry into whether police and intelligence services could have prevented the March 15 terrorist attack in which Brenton Tarrant killed 50 people.

The world will learn from the New Zealand inquiry’s findings, but all democracies should have measures to counter far-right extremism before it reaches the radar of intelligence services or police.

The Christchurch attack shocked the world, but not those of us who have studied the far-right and its flourishing, transnational online ecosystem.

The far-right accounted for nearly half of all referrals to the British government's counter-extremism programme in 2017-2018 and the vast majority of extremist-related killings in the US in recent years; it has inspired violence within its ranks while pushing its ideas out to mainstream audiences largely unabated.

A 2017 analysis of public Facebook pages in the UK by the Institute for Strategic Dialogue unearthed 40,000 accounts that engage with hateful far-right views, compared with only 2,000 that engage with Islamist extremist views. Yet despite the clear threat, efforts by governments and the private sector to stem Islamist violent extremism online simply have not been matched by efforts against far-right extremism.

Britain, for example, has public and private sector frameworks, and the technology and personnel to deal with a range of harmful content, including child pornography and Islamist violent extremism. But they failed to cope with the thousands of attempts to repost and modify the Christchurch terrorist’s video so as to bypass filters.

While tech giants have come under increased pressure from European governments to remove illegal content, these platforms have struggled to meet the scale of the problem.

One urgent need is for the companies to work with experts to train their human and machine-learning systems to better identify far-right content and accounts. However, this will not address the problem posed by the wider online ecosystem.

Extremists have increasingly migrated to alternative online spaces, where moderation is either limited or non-existent. Forums such as 4chan and 8chan, messaging apps such as Telegram, social networks such as Gab, and gaming platforms like Discord act as virtual safe havens for hateful propaganda and even for the mobilisation and planning of illegal activities.

Governments need to ensure that offline laws are applied to all of these online spaces.

When it comes to existing laws addressing hate speech, harassment, and even terrorism, enforcement focuses heavily on explicit incitement to violence or affiliation with proscribed groups, both of which are a relatively small part of the problematic behavior that drives extremist sub-cultures and ecosystems.

So, what do we do about content that sits in a legal grey zone — such as the would-be manifesto of the New Zealand attacker — that is objectionable and hateful, but not necessarily illegal?

Firstly, our educational programmes need to be fit for the digital age. Improving digital literacy, and enabling people to identify propaganda and to understand how the platforms serve them content and use their data, is essential. Some programmes do this, but they are not delivered at the scale we need. Governments should integrate digital citizenship education into national curriculums.

Our second line of effort must focus on equipping civil society with the tools and skills to compete with extreme voices in our new digital ecosystem. Civil society organisations, like ours, have worked in partnership with tech companies and experts to trial innovative, community-based solutions.

These include tools for local authorities to analyse online hate speech in their communities and its impact on hate crimes, one-to-one engagements with those expressing violent extremist views, and promotion of alternative content for those searching for extremist material.

These types of projects, however, struggle to get past the pilot phase and need to be scaled up significantly to have an impact on the problem.

The virtual elephant in the room, though, is the products and algorithms deployed by tech companies that tip the scales towards hateful, extreme voices over the majority of us who oppose them. Engagement-powered newsfeeds still disproportionately influence public discourse.

Facebook founder Mark Zuckerberg has admitted that users engage most with the most controversial content on his platform, and its architecture still encourages this.

European governments are moving closer to regulation of social media companies, with Germany having already enacted legislation (NetzDG, which fines platforms if they fail to remove flagged illegal content within 24 hours) and with Britain due to publish its plans in the coming weeks.

Their focus, so far, has only been on big tech and on content-moderation. They must also look at ways to encourage changes to the distortive impact of the platforms’ technological architecture, as well as the wider tech/media ecosystem.

Extremism is a pan-ideological phenomenon. We need to apply existing laws more effectively to the online world and dedicate the resources and solutions built up focusing on Islamist-inspired extremism to the far-right threat.

Without a more robust effort by democratic countries to come to a consensus on internet governance, virtual hate will continue to have devastating, real-world consequences.
