

Facebook, Google, Twitter, YouTube Need To Take Responsibility For Election Integrity


Throughout the Netflix docudrama, The Social Dilemma, a handful of technology experts and born-again Silicon Valley bros explain how social media operates on a business model that tracks and alters users’ behavior in order to create a society that’s addicted to screens, all for the express goal of selling targeted ads.

The consequences of this global addiction are existential. If you believed in free will before, you won’t after watching this film. And as the film runs down the many ways social media has negatively affected society, from the rise in depression and anxiety among teens to a complete lack of data privacy to a deluge of misinformation, the most troubling is the extreme polarization of civil discourse, especially as we near the most significant election in American history.

One key takeaway from The Social Dilemma is clear: social media is the biggest threat to our democracy. In the absence of almost any functioning policy that regulates the technology industry and protects citizens from various harms and election interference, it should fall on the tech giants to step up to the plate. If a platform takes on election advertising, then it’s up to that platform to help regulate election integrity. Or at least, that’s how it would be in a world not ruled by surveillance capitalism.

Accountable Tech, a tech watchdog, has recently created an Election Integrity Roadmap for Social Media Platforms. The Roadmap was designed in consultation with technologists, civil rights leaders and disinformation experts to “help social media companies responsibly navigate the 2020 election season, from early voting through the official certification of results.” It focuses on using existing policies, tools and technologies to outline what specific companies can do, starting now, to address election-related content and threats. 

What social media companies can do now

The Roadmap suggests beginning straightaway by implementing a clearly defined “Election Integrity Strike System” that aims to “progressively limit the reach of serial disinformers and provide context to help inform and inoculate other users, defanging the worst actors before the most volatile period of the election season.” This includes the disinformation and misinformation published by politicians. The Roadmap directly calls out Facebook’s fact-checking exemption for politicians, a point that’s ever more salient as the president uses various digital microphones to lie about things like voter fraud or encourage citizens to vote twice.
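The Roadmap describes the strike system only at a high level and leaves the exact thresholds to each platform, so the sketch below is purely illustrative: the strike counts and reach multipliers are assumptions, not Accountable Tech’s specification, but they show how “progressively limiting the reach of serial disinformers” could work in practice.

```python
# Hypothetical sketch of a strike-based reach limiter, loosely modeled on the
# Roadmap's "Election Integrity Strike System" language. The thresholds and
# multipliers are illustrative assumptions, not Accountable Tech's spec.

from dataclasses import dataclass

@dataclass
class Account:
    handle: str
    strikes: int  # confirmed election-disinformation violations

def reach_multiplier(account: Account) -> float:
    """Progressively reduce algorithmic distribution as strikes accumulate."""
    if account.strikes == 0:
        return 1.0   # normal distribution
    if account.strikes == 1:
        return 0.5   # halve reach and attach a context label
    if account.strikes == 2:
        return 0.1   # demote sharply, require click-through warnings
    return 0.0       # serial disinformer: no algorithmic amplification at all

if __name__ == "__main__":
    print(reach_multiplier(Account("example_account", strikes=3)))  # -> 0.0
```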

Other immediate suggestions include expanding context about political posts and encouraging users to pause and think critically, which could help to better educate voters and slow the spread of viral misinformation while not restricting free speech. The problem is, according to the sources featured in The Social Dilemma, the algorithms of these companies aren’t exactly designed to help people analyze information. 

Rashida Richardson, director of policy research at New York University’s AI Now Institute, explains why the idea of adding unbiased and truthful context about elections is almost a pipe dream after years of psychological manipulation: “We are all simply operating on a different set of facts,” she said. “When that happens at scale, you’re no longer able to reckon with, or even consume information that contradicts with, that worldview that you’ve created. That means we aren’t actually being objective, constructive individuals.”

In the weeks leading up to the election...

Accountable Tech’s Roadmap suggests that by October, social media companies should be implementing a “Platform Poll Watchers program” to ensure the integrity of the online information sphere, utilizing nonpartisan civil society groups and experts to promote credible information regarding voting rights, election law or online disinformation, while flagging and countering false narratives in real time. Again, not exactly what these companies were designed to do, but if there’s ever a time to take a break from making money, it’s now.

“We’ve created a system that biases towards false information,” said Sandy Parakilas, former platform operations manager at Facebook, in the film. “Not because we want to, but because false information makes the companies more money than the truth. The truth is boring.”

But we need the truth, whatever that even means anymore. So the Roadmap also calls for greater transparency around specific plans to increase the capacity of content moderation, educating voters on topics like what happens after their ballot is cast and implementing a virality circuit breaker, “temporarily throttling unverified fast-spreading content and prioritizing it for immediate human review.” 
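The Roadmap describes the circuit breaker only in a sentence, but the underlying logic is mechanical: if unverified content is spreading faster than some threshold, throttle its distribution and push it to the front of the human-review queue. A minimal sketch, with the velocity threshold and throttle factor as illustrative assumptions rather than anything specified in the Roadmap:

```python
# Hypothetical sketch of a "virality circuit breaker": unverified posts spreading
# faster than a threshold are throttled and queued for human review. The threshold
# and throttle factor below are assumptions made for illustration.

import heapq

VELOCITY_THRESHOLD = 1000   # shares per hour that trips the breaker (assumed)
THROTTLE_FACTOR = 0.2       # fraction of normal distribution while under review

review_queue: list[tuple[int, str]] = []   # max-heap by share velocity (negated)

def check_post(post_id: str, shares_last_hour: int, fact_checked: bool) -> float:
    """Return a distribution multiplier; queue fast-spreading unverified posts."""
    if fact_checked or shares_last_hour < VELOCITY_THRESHOLD:
        return 1.0
    # Breaker trips: throttle distribution and prioritize the hottest posts for review.
    heapq.heappush(review_queue, (-shares_last_hour, post_id))
    return THROTTLE_FACTOR

if __name__ == "__main__":
    print(check_post("post-123", shares_last_hour=4800, fact_checked=False))  # 0.2
    print(review_queue[0][1])  # "post-123" is first in line for human review
```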

Once we hit the last couple weeks of October, social media platforms can shift their attention to advertising, placing a freeze on new advertisers until the election is over and applying the highest standards of existing transparency requirements to all ads. 

It almost sounds radical to even suggest it, given what we know about social media algorithms as an integral part of the business model, but the Roadmap also calls on platforms to limit algorithmic amplification. Some specifics include temporarily turning off algorithmic curation of the Twitter timeline and Facebook News Feed, YouTube’s autoplay and Up Next engine, Twitter’s Trending Topics and Facebook Group recommendations (which, according to Facebook’s own internal research from 2016, were responsible for at least 64% of all extremist group joins).
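In engineering terms, this amounts to little more than flipping feature flags for the duration of the election window. The flag names below are hypothetical, invented only to make the idea concrete; they are not real platform APIs.

```python
# Illustrative only: a feature-flag view of "limit algorithmic amplification."
# Flag names are assumptions for the sake of example, not real platform settings.

ELECTION_WINDOW_OVERRIDES = {
    "twitter.timeline.algorithmic_ranking": False,   # fall back to chronological feed
    "facebook.news_feed.algorithmic_ranking": False,
    "youtube.autoplay_up_next": False,
    "twitter.trending_topics": False,
    "facebook.group_recommendations": False,
}

def is_enabled(flag: str, in_election_window: bool, defaults: dict[str, bool]) -> bool:
    """During the election window, the overrides win; otherwise use normal defaults."""
    if in_election_window and flag in ELECTION_WINDOW_OVERRIDES:
        return ELECTION_WINDOW_OVERRIDES[flag]
    return defaults.get(flag, True)

if __name__ == "__main__":
    print(is_enabled("youtube.autoplay_up_next", True, {"youtube.autoplay_up_next": True}))  # False
```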

Wouldn’t it be nice if social media giants recognized that in the weeks leading up to the election, we’ll all be hooked on our news feeds enough without the extra prodding of an AI that’s specifically designed to outrage us? We’ve been through multiple recessions, a pandemic, murder hornets, climate change catastrophes and a civil rights movement. We’re outraged enough, thank you.

On election day and beyond...

Finally, in the immediate aftermath of the election, the Roadmap recommends that social media companies implement strategies to prevent online communities from fomenting real-world violence, which experts say is a serious threat if the validity of the results is in doubt and misinformation continues to spread. One strategy could be to immediately flag, and review before posting, any content from accounts with more than 250,000 followers and Election Integrity Strikes against them.
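That rule is simple enough to state as a single condition. The sketch below is illustrative only; the field names are assumed for the example rather than drawn from the Roadmap.

```python
# Sketch of the post-election review rule described above: content from accounts
# with more than 250,000 followers AND prior Election Integrity Strikes is held
# for human review before it goes live. Field names are illustrative assumptions.

FOLLOWER_THRESHOLD = 250_000

def requires_pre_publication_review(followers: int, strikes: int) -> bool:
    """High-reach repeat offenders are flagged and reviewed before posting."""
    return followers > FOLLOWER_THRESHOLD and strikes > 0

if __name__ == "__main__":
    print(requires_pre_publication_review(followers=1_200_000, strikes=2))  # True
    print(requires_pre_publication_review(followers=1_200_000, strikes=0))  # False
```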

The good news is that all of the goals outlined in the Roadmap are attainable and may actually make a difference. The bad news is that without existing policy and regulation, nothing can make social media companies adhere to these important standards and push aside the very strategies that they willfully implemented to get users addicted. As Yaël Eisenstat, former global head of elections integrity operations and business integrity at Facebook, detailed in a Guardian interview this summer, Facebook’s decisions are guided by certain factors. “One was: whatever we do has to be something that answers the criticism but never opens us up for having to take actual proactive responsibility and threaten in any way our Section 230 immunity.” (See: Facebook’s addition of a misleading label to election-related posts that doesn’t actually fact-check them.)

“Second,” continued Eisenstat, “we also need to be very careful not to bite the hand that feeds us — we’re not to anger the powers who are currently in office who could regulate us.” The only thing that Democrats and Republicans seem to agree on these days is that Big Tech needs to be regulated, but they differ in their respective qualms and approaches. Republicans are more likely to complain that conservative voices are being censored, and Facebook has rushed in to ease their fears. Facebook CEO Mark Zuckerberg has staunchly rejected the responsibility of his platforms to be “arbiters of truth” in a way that’s probably meant to seem nonpartisan but, in reality, looks like he’s throwing his weight behind the Trump administration.

“Facebook, more so than other platforms, has gone out of its way to not ruffle feathers in the current administration,” co-founder of Accountable Tech, Jesse Lehrich, told Bloomberg. “At best, you could say it’s willful negligence.”

Bloomberg also reported that Zuckerberg has told employees that Facebook is likely to fare better under Republicans than Democrats. But Zuckerberg isn’t a particularly political person, according to those who are familiar with him. He does, however, put the interests and growth of his company first, which is incredibly troubling. At the end of the day, there’s no fiscal reason why Facebook or other social media platforms should regulate themselves for the election. All we can do is continue to put pressure on our elected officials and the companies themselves, and pray that just this once, the long-term national interest is more important than the short-term financial interests of the richest companies in history.

“We in the tech industry have created the tools to destabilize and erode the fabric of society in every country all at once, everywhere,” said one of the main interview subjects of The Social Dilemma, Tristan Harris, who is a former design ethicist at Google and current co-founder of the Center for Humane Technology. “Do we want this system to be for sale to the highest bidder? For democracy to be completely for sale, where you can reach any mind you want, target a lie to that specific population and create culture wars? Do we want that?”
