The U.S. Capitol Insurrection as a Byproduct of Online Right-Wing Extremism
Right-wing extremists planned the Capitol insurrection publicly online. It was the climax of weeks of intense plotting, much of which took place openly on websites popular with far-right extremists. According to U.S. News, between January 6 and January 8, the terms “civil war,” “trust the plan,” and “hold the line” were invoked more than 250,000 times across social media platforms, including Instagram, Twitter, and Reddit. This highlights the extent to which social media has become an increasingly important tool for right-wing extremists to spread content and build networks, posing a deep threat to the security and stability of American democracy.
Jared Holt, a visiting research fellow at the Atlantic Council’s Digital Forensic Research Lab, focuses on extremist online activity. Holt noted that far-right extremist websites like TheDonald, Parler, Gab, and MeWe are filled with “conspiracy theories, disinformation, and outright lies about the results of the election…and those lies came from the top arbiters of power in the Republican party, notably President Donald Trump himself.” The insurrection at the Capitol brought unparalleled traffic to some of these sites: on the day of the riots, traffic was up by 40%.
Furthermore, according to NPR and research from Advance Democracy, more than 80% of the top posts on TheDonald on the day of the insurrection featured calls for violence in their top five responses. On Twitter, more than 1,480 posts invoking terms of violence were made between January 1st and January 6th; on TikTok, accounts encouraging violence received hundreds of thousands of views.
The insurrection at the United States Capitol on Wednesday, January 6th was therefore the coming together of multiple online communities that had been building zeal and eagerness for such an event for years. As Vox put it, individuals from these extremist, right-wing online communities “planned [the insurrection] on social media and, as it was happening, gleefully live streamed the destruction.”
The attack at the Capitol is not an isolated example of white supremacists gathering and spreading their message on internet platforms like Facebook, YouTube, and Twitter to incite violence. The Christchurch, New Zealand attacks, in which a man killed more than fifty Muslim worshippers, were a direct result of online radicalization; domestically, events like the “Unite the Right” rally in Charlottesville and the Pittsburgh synagogue shooting have their roots in white supremacist and bigoted messaging that festers online. It is also evident that extremists online are instigated and emboldened by high-profile leaders like Trump. Most clearly, these domestic attacks illustrate how the growing influence of online extremist groups like the Proud Boys, QAnon, and neo-Nazis deeply threatens to subvert American democracy.
Online websites and social media provide a safe haven for extremist movements, groups, and individuals. The insurrection at the Capitol demonstrates that the existing approach to fighting extremism, white supremacist rhetoric, and misinformation isn’t working. Extremists and conspiracists continue to weaponize digital platforms for their messaging, and as Holt predicts, “[smaller] companies, which don’t have the same legal defenses or resources or infrastructure that a major site like Facebook or Twitter has, may falter under that pressure.” Even so, QAnon conspiracists and other extremists could not have cultivated such an immense online presence and network without being amplified by platforms like Facebook and Twitter.
It is clear that social media companies and internet websites must strengthen their efforts to take down hate speech, calls for violence, and white supremacist and extremist messaging on their platforms. Instead of reacting with band-aid strategies, companies must make a concerted effort to go beyond content removal and account deactivation and be proactive in addressing this deep-rooted problem. These strategies can include detoxing recommendation algorithms, using social media to promote counter-narratives, and providing retroactive corrections; leveraging the Global Internet Forum to Counter Terrorism (GIFCT) may also be a route worth exploring, as is ensuring that less well-resourced online platforms have proper security. Devising a thorough online counterterrorism strategy will undoubtedly take time and strong commitment from online organizations and companies both small and large. However, one thing is certain: we cannot continue to passively wait for online extremism to once again translate into offline, in-person chaos and violence as it did at the Capitol.