Julia Ebner worries about punishing far-right rioters too harshly. Following the violent protests and rioting across England this August, Keir Starmer promised to “do whatever it takes to bring these thugs to justice”. Within a week, hundreds were arrested and charged with violent disorder. Some have already been sentenced to several years in prison. But Ebner, who researches far-right extremism and terrorism prevention at the Institute for Strategic Dialogue, is concerned that this will only bring extremists closer together.
“I think we’re reaching a point where there are some very strong group dynamics emerging among the protesters, especially because they face escalations with the security services,” she tells me. “That creates very strong bonding experiences.” There’s a risk of a phenomenon known as “identity fusion”—or, as Ebner puts it, “when your personal identity becomes one with the group identity. And that happens especially in times when you share very deeply emotional or transformative experiences with your peers or with your in-group.”
Before the riots, the far right was pretty fractured. There was the former English Defence League—“the older generation of anti-Islam activists and Tommy Robinson sympathisers,” Ebner says. They overlapped with the “football hooligan community” and with some younger Robinson sympathisers. And then there was Patriotic Alternative, which targets more educated, tech-savvy people who are less economically deprived—often school or university students.
On the hard and populist right, there was Laurence Fox, who is popular with conspiracy theorists, and Andrew Tate, who dominates the anti-feminist space. And of course there was Elon Musk, who, with his 194m followers on X, appeals to an audience of ultra-libertarians.
In the hours after the Southport killings, these different factions coalesced in fury.
In addition to her day job, Ebner goes undercover among extremists, using fake identities. In her 2020 book, Going Dark: The Secret Social Lives of Extremists, she documented her experiences of the social media channels in which the white supremacist Charlottesville rally was planned, and of a neo-Nazi music festival she attended on the Germany-Poland border. The UK far right is different to its European counterparts, she says—less keen to flirt with Nazi symbolism, for instance, because of Britain’s role in the Second World War.
In recent years, as Islamic State’s terrorist activity has waned in the west, the far right’s focus has shifted from an anti-Muslim agenda towards a more general ethnically driven hatred. It is spurred on by the so-called Great Replacement conspiracy theory, which claims that white populations are being deliberately replaced by non-white immigrants. “We see violence directly against migrants, whether they are Muslims, whether they are Christian—it doesn’t really matter to [the far right], as long as their skin colour is different,” Ebner says. “That used to not be as acceptable, even in far-right circles, as it is now.”
Recently, too, the far right has built up “really effective communication channels”, Ebner says. Protests were planned on encrypted messaging platforms (such as Telegram and WhatsApp), fringe social media outlets (such as Gab) and alternatives to YouTube (such as Odysee). But much of the rage was spread on big tech platforms—primarily X.
Since Musk took over X, he has readmitted Robinson, Tate and others who were previously banned from the site for hate speech. But Ebner says the platform’s impact on the riots went beyond its lax attitudes towards racism and the spread of disinformation. Accepting that kind of speech also has a psychological effect on members of the far right, who now feel they have an ally among the tech platforms. “For the first time, they feel like they see their ideologies and their activities being normalised and legitimised,” Ebner says. “And I think that is also very dangerous, because it further emboldens them and tells them, ‘Well, actually, maybe the other platforms were just so politically biased.’ ” It suggests that even racism and white nationalism can be considered part of legitimate debate.
Fact-checking and “debunking campaigns” can be useful in countering disinformation, Ebner says, though it’s better to stop the spread of fake news in the first place. But at big tech companies, algorithms feed the outrage machine. “They prioritise the content that is most radical, most eclectic, that taps into human psychology to harvest our attention,” she says. Why should they care about riots in the streets?