Monday, August 05, 2019

On Online Radicalization


The recent massacre in El Paso, Texas, carried out by a white supremacist, turned the spotlight on a hate site called 8chan.  That site is not alone among the hate sites, but it certainly is the nastiest.  Indeed, it has become the site where several recent terror attacks have been pre-announced:

In recent months, 8chan has become a go-to resource for violent extremists. At least three mass shootings this year — including the mosque killings in Christchurch, New Zealand, and the synagogue shooting in Poway, Calif. — have been announced in advance on the site, often accompanied by racist writings that seem engineered to go viral on the internet.
Public pressure forced the service provider for 8chan, Cloudflare, to take the site down.  It won't stay down, of course. 

But at least we are finally talking about the many ways that radicalization happens online.  Large social media sites were able to get together and do something about the online presence of radical Islamist terror groups.  Now they should similarly address the online presence of white supremacist terror groups,* which are also breeding grounds for radicalization.

Today's New York Times editorial notes that there has been less interest in doing that:

Technology companies, too, appear unwilling to treat white nationalist terror online the way they have dealt with the online spread of radical Islamic terror groups, such as the Islamic State. Companies like Facebook and Twitter took bold action to remove tens of millions of pieces of ISIS and Al Qaeda propaganda and accounts between 2014 and 2018. Similar standards have not been applied to white nationalists, perhaps because, as a 2018 report from researcher J.M. Berger, who specializes in online extremism, notes, “The task of crafting a response to the alt-right is considerably more complex and fraught with land mines, largely as a result of the movement’s inherently political nature and its proximity to political power.”
Proximity to political power...

But it's not just the fact that some in Trump's base are white (male) supremacists that makes the regulation of online hate sites so difficult.  Practical difficulties abound:

Law enforcement currently offers few answers as to how to contain these communities. The anonymous nature of the forum makes it difficult to track down the validity of threats, and trolls frequently muddy the waters by attempting to dupe authorities with false threats and disinformation. 
And laws about online activities lag far behind our current Wild West reality.  Do sites such as 8chan bear any legal responsibility for providing a venue where terrorists can plan their crimes and new terrorists are radicalized?  Can they be sued by the families of the El Paso massacre victims, say?

I have no idea.  But clearly the current legal and law enforcement approaches to such hate sites and the damage they do are inadequate.  It's time to change that.

Finally, Fredrick Brennan, the founder of 8chan, stated recently that his initial goal with the site was to create a free speech utopia:

Mr. Brennan, who has claimed that he got the idea for 8chan while on psychedelic mushrooms, set out to create what he called a free speech alternative to 4chan, a better-known online message board. He was upset that 4chan had become too restrictive, and he envisioned a site where any legal speech would be welcome, no matter how toxic.

Mr. Brennan, who is no longer affiliated with the site, now wants it shut down.

I wonder what he thought a free speech "utopia" with pretty much no moderation would produce, if not the fruits that we are now harvesting.**



--------

* Those sites are even more hateful than that, if possible:

The result is an evolving brand of social media-fueled bloodshed. Online communities like 4chan and 8chan have become hotbeds of white nationalist activity. Anonymous users flood the site’s “politics” board with racist, sexist and homophobic content designed to spread across the web. Users share old fascist fiction, Nazi propaganda and pseudoscientific texts about race and I.Q. and replacement theory, geared to radicalize their peers.

**  My impression is that any online political commenting site without moderation ends up not as a free marketplace of ideas but as something a little like the market for lemons, though not for quite the same reasons: 

Bad speech drives out better speech, trolls take over because they are allowed to and have more time and stamina, and extreme opinions, often toxic ones, end up dominating the debates.  Finally, only the bottom feeders remain, patrolling the area for new victims.