A Network of Extremism

You can’t make 2.4 billion friends without installing a few fascists.


By: Leo Kamin

In 2004, he was using his gift for computer science and his Harvard education to let his peers decide which of their female classmates was the “hottest.” Sixteen years later, 2.4 billion people use his subsequent invention — Facebook. In those 16 years, Mark Zuckerberg has graduated from a small-scale misogynist ruffling feathers in Cambridge to the man at the helm of a platform that has become the single most destructive force in geopolitics.

Facebook has been partially credited with the rise of President Donald Trump in the United States and the Brexit movement in the U.K. Jair Bolsonaro, Brazil’s far-right, notoriously-bigoted president, was swept into power on the back of a massive misinformation campaign on WhatsApp, a platform owned by Facebook. In Myanmar, where Facebook is pre-downloaded onto most smartphones and is itself considered by some to be the internet, the military churned out a barrage of false, inflammatory posts about the country’s minority Rohingya Muslim community, building popular support for an ethnic cleansing that has killed an estimated 40,000 people and forced hundreds of thousands to flee. 

These connections may at first seem outlandish; that’s understandable. Facebook is, to many of us, first and foremost the site where our parents share corny memes that were maybe funny in 2014. However, it is also the main source of news and information for many. In 2017, Pew Research Center found that 45 percent of Americans got their news from Facebook. This number is likely even higher now.

Despite the number of people relying on the platform for news, Facebook has been at best unable, and at worst unwilling, to crack down on the spread of misinformation. The service has introduced some third-party fact-checking, among other measures, but the problem mostly persists. Facebook has even attempted to put the onus on states themselves to curb misinformation surrounding the election, while providing them with monitoring tools that, according to experts, are much too limited in scope.

It is, admittedly, unrealistic to expect Facebook to comb through and stamp out every post containing misinformation among the hundreds of millions posted daily. It is reasonable to expect, though, that the site — and more specifically, its algorithms — would not go out of their way to push users toward increasingly false, inflammatory content. But, as you can probably guess by now, that is exactly what they have done.

Facebook has long realized that for many of us, the more incendiary the content, the longer we will view it. It turns out that most of the time, the pure, unvarnished truth isn’t very interesting. You know what is interesting? A conspiracy theory claiming that high-ranking Democrats are running a child sex ring out of a Washington, D.C. pizza joint — a theory that has flourished in a number of Facebook groups, even leading a man to fire gunshots at the establishment in 2016.

Facebook has, in its own internal investigations, long recognized the divisive, radicalizing power of its algorithms, which often push extremist content and groups to susceptible users. Internal Facebook documents revealed that in 64 percent of instances where a user joined an extremist group, the join was a direct result of the company’s recommendation algorithm. Despite this recognition, Facebook’s leadership has not made the changes necessary to prevent the site from taking an active hand in the radicalization of millions.

Led by their policy chief, Joel Kaplan, an alum of the George W. Bush administration and former oil lobbyist, Facebook ignored the internal findings and rejected many proposals to improve political discourse on the platform. Kaplan was hired in part to help assuage conservatives who believed the site was censoring their viewpoints. 

Never in human history has our information and news ecosystem been so personalized — and, as a result of the time-maximizing algorithms of Facebook and other technology companies, so polarized. As vast swaths of our country come to believe in outlandish conspiracy theories like QAnon and deny the science behind masks, vaccines and climate change, it is hard to believe that Facebook’s refusal to take seriously its role as the world’s largest purveyor of information hasn’t had an impact. As we see the rise of extremist governments in countries that were once bastions of democracy, and watch hate crimes and political violence skyrocket, it is hard not to think of the complex algorithms steering millions of impressionable users toward groups espousing hate.

Unless Mark Zuckerberg takes affirmative steps to combat the misinformation and hate that have spread like wildfire on his platform, history will remember him as something much worse than a petty misogynist.