Online Extremism Persists: Examining Why Efforts to Combat It Fall Short

The typical response to shootings connected to online hate, or to the spread of violent networks on social media, is to insist that platforms need to do more.

My research indicates that while content moderation takedowns have increased on major sites, extremists still find online spaces to recruit, organize, and incite violence. Perhaps the question should be whether the problem is too big for any single platform to handle.

The focus is too often on individual platforms like Facebook, X, YouTube, or TikTok, and not enough on the fragmented nature of content moderation across the internet. When governments scrutinize “Big Tech” and platforms increase moderation, extremist movements move to smaller, alternative platforms. Fewer rules and smaller safety teams mean more opportunities to radicalize and to test ways of getting back on larger platforms.

Recently, some major platforms have loosened content moderation rules under the banner of free speech. For example, under Elon Musk, X significantly reduced its trust-and-safety teams. Meta ended its third-party fact-checking program and redefined its hate speech policies. Because these platforms have the widest reach, extremists regain access to mainstream audiences and restart a cycle of radicalization that smaller platforms alone cannot sustain.

My research, using multi-platform data and case studies, shows how extremists build resilience in this uneven landscape. Their strategy involves using fringe sites or encrypted messaging apps to post the most inflammatory material. Then they craft “toned-down” versions for mainstream platforms that are hateful but not quite enough to trigger takedowns. They also exploit the resentment of users who feel censored, turning it into a rallying cry. This strategy thrives on inconsistent enforcement, which allows extremists to evade bans and rebuild across platforms.

This piecemeal approach means extremist movements are never truly dismantled, only displaced. Instead of weakening them, it teaches them to evolve, making future enforcement harder.

If platforms coordinated their standards and enforcement protocols, it would remove the “arbitrage” extremists rely on. Analyses of 60 platforms show that policy convergence reduces safe havens for violent groups because they can’t exploit enforcement gaps. When platforms coordinate, extremists have fewer places to regroup and shift to when bans occur.

Coordination isn’t simple; content moderation raises concerns about free speech and potential abuse. However, for content like terrorist propaganda and hate speech advocating violence, aligning standards would close loopholes.

Building trust-and-safety capabilities isn’t cheap, especially for smaller platforms. ROOST, supported by philanthropic foundations and tech companies like Google, OpenAI, and Roblox, provides open-source software and shared databases to help platforms identify and remove extremist material. This promises greater convergence without each company having to reinvent moderation from scratch.
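
The mechanics behind such shared databases are simple in principle: participating platforms exchange fingerprints (hashes) of known extremist material rather than the material itself, so even a small site can check uploads against the same list larger platforms use. The sketch below is a minimal Python illustration using plain SHA-256 lookups; the hash set and function names are illustrative assumptions, not ROOST’s or GIFCT’s actual interface, and production systems typically rely on perceptual hashes that tolerate re-encoding and small edits.

```python
import hashlib

# Hypothetical local copy of a shared hash list of known extremist material.
# Consortium members distribute fingerprints (hashes) rather than the content
# itself, so platforms can match uploads without exchanging the material.
SHARED_HASH_SET = {
    # SHA-256 digest of the example upload used below (b"test")
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def content_digest(data: bytes) -> str:
    """Return the SHA-256 hex digest of an uploaded file's bytes."""
    return hashlib.sha256(data).hexdigest()

def should_flag(data: bytes) -> bool:
    """Flag an upload for human review if its digest appears in the shared list."""
    return content_digest(data) in SHARED_HASH_SET

# Usage: a small platform checks each upload before publishing it.
upload = b"test"
if should_flag(upload):
    print("Matched shared database - route to trust-and-safety review.")
else:
    print("No match - publish normally.")
```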

Political barriers remain. There’s no consensus on where the line between extremist speech and legitimate political expression lies, and views are polarized. Extremist violence isn’t partisan, though; attacks linked to online radicalization show that hate and terror thrive in the gaps between platforms.

We can agree that calls for violence, hate-based harassment, and terror propaganda warrant intervention. Multi-platform initiatives like ROOST or the Global Internet Forum to Counter Terrorism can make headway there.

Until the incentives enabling the migration of extremist content across platforms are addressed, we’ll keep asking why these attacks happen. The answer is a fragmented system where each platform fights its own battle, while extremists exploit the seams.

It’s time to demand not just that “Big Tech do more,” but that all online spaces commit to a more unified stance against extremism. Only then can we begin to plug the countless leaks that keep feeding digital hate.