By David Williams (he/him)
Ten years ago, the Arab Spring was in full bloom. People across the Middle East and North Africa (MENA) were rising up against authoritarian governments and demanding greater democratic rights. These protesters used a tool never before seen in revolutions: social media. They used Facebook, Twitter, and YouTube to mobilise people and spread their cause. The platforms were effective because they evaded government censors while dispersing information quickly. Social media was seen as a great democratising force that could chip away at the control authoritarians held over their countries' press. Moreover, the movements organised on social media had real political impacts.
The revolution that sparked the Arab Spring was in Tunisia. Street protests – organised on social media – led to the Tunisian president stepping down, allowing parliamentary elections in the country for the first time in 24 years. But the democratising power of social media was not limited to the Arab Spring.
The term Black Lives Matter is credited to Alicia Garza, Patrisse Cullors, and Opal Tometi in 2013. The phrase soon became the trending hashtag #BlackLivesMatter, now a rallying call under which protesters and activists fight against police brutality and racial inequality. MeToo was first coined in 2006 by activist Tarana Burke; #MeToo reached mainstream popularity in 2017, when sexual abuse allegations against Harvey Weinstein came to light.
Social media is a promising tool to mobilise activists and spread progressive messages. However, the democratising power of social media can also bring dark forces to the surface. While politicians saw that social media could spread information holding them to account, they soon realised it could also be utilised to spread whatever information they want – whether that information is true or not.
In the past, politicians who wished to stop the spread of information could simply ban it. Now they can just as easily spread their own narrative instead. This muddies the water, confuses people, and drives them into echo chambers. Given the speed at which information spreads across social media, traditional media struggles to keep up. By the time a claim can be proven false, the damage is already done; the narrative has taken hold and supporters will not believe otherwise. A 2018 MIT study found that “True stories were rarely retweeted by more than 1,000 people, but the top 1 percent of false stories were routinely shared by 1,000 to 100,000 people.”
Anti-democratic forces take advantage of this phenomenon to spread misinformation that undermines democracy. The relentless Republican propaganda that Joe Biden stole the election from Donald Trump spread like wildfire across social media following the 2020 American election. According to a New York Times report, “Across Facebook, there were roughly 3.5 million interactions – including likes, comments and shares – on public posts referencing ‘Stop the Steal’ during the week of Nov. 3.” Even six months later, according to a Reuters poll, “The 17–19 May national poll found that 53% of Republicans believe Trump, their party’s nominee, is the ‘true president’.” While social media companies did not produce this content themselves, they provided the ideal platform on which it could find an audience.
The heads of these companies, meanwhile, didn’t care. When testifying before Congress in 2019, Mark Zuckerberg said “Our policy is that we do not fact-check politicians’ speech. And the reason for that is that we believe that in a democracy it is important that people can see for themselves what politicians are saying.”
This hands-off approach allowed disinformation to run rampant. The companies saw their platforms as big empty spaces where, provided it doesn’t break the law, anyone can say whatever they want – politicians included.
Yet a key feature of these sites allowed disinformation to spread faster than any politician could say it: the algorithm. The goal of any form of media is to keep the user engaged. For a platform like YouTube, that means people watching more videos. The company designed an algorithm to recommend videos a user might find interesting, based on what they were currently watching. For content creators who wished to spread undemocratic messages, this was gold. They didn’t need to spread their messages themselves, because the algorithms were spreading them for them.
Former YouTube staff members have said that “company leaders were obsessed with increasing engagement during those years. The executives, the people said, rarely considered whether the company’s algorithms were fuelling the spread of extreme and hateful political content.”
While social media companies maintained a hands-off approach, they were making billions from the spread of disinformation. They did not care about the health of democracy because they were not beholden to democratic institutions. They cared about making money, and more views equalled more advertising revenue.
Dr Ethan Plaut says “Social media has great democratic potential but this will likely be corrupted and squandered as long as the platform's raison d'être is to enrich distant shareholders.”
However, social media companies' hands-off approach may have reached its breaking point. The stolen-election lie turned violent when, on the 6th of January 2021, thousands of right-wing insurrectionists stormed the US Capitol building, hoping to overturn the results of the recent election. All of them believed the “stop the steal” narratives that proliferated on social media. The next day, Twitter suspended Donald Trump’s account. That ban would later become permanent.
Now, social media companies are taking greater steps to stop the spread of disinformation. At the end of 2020, Twitter began adding warning labels to certain tweets that contain misinformation.
Facebook began restricting political ads a week out from the 2020 US election and created posts encouraging voting while removing any misinformation about voting.
Governments are taking steps to rein in social media companies as well. The EU is advancing the Digital Services Act, which will set rules on how big platforms handle illegal and harmful content. The German and French parliaments passed similar legislation in 2017 and 2021 respectively.
While these changes are a good start, is it already too late? Recent studies have shown that animosity towards opposing political parties is higher than support for one’s own party. There is no doubt that social media has exacerbated these trends. To have a functioning democracy, voters at either end of the spectrum must be able to talk to each other. Otherwise, peddlers of divisive rhetoric will continue to thrive. Zuckerberg and Twitter’s Jack Dorsey opened Pandora’s box when they chose to ignore the spread of disinformation on their platforms.
Freedom of speech is something we must all guard, but it comes with responsibility. Social media platforms can no longer hide behind the cloak of “we’re just a platform”. They are part of the structure of society now. And when you are part of the societal structure, you have a responsibility to that society and every facet of it, including democracy.