In the weeks following the U.S. presidential election, social-media companies touted their success at preventing a repeat of the foreign interference that was rampant on their platforms in 2016. But this year’s election has sparked concerns over their response to the spread of misinformation coming from American politicians and domestic conspiracy groups – an issue that has only served to renew the political furor over social media.
U.S. senators have hauled Facebook chief executive Mark Zuckerberg and Twitter CEO Jack Dorsey in front of Congress twice in the past two months and have suggested a third hearing before the end of the year. Republicans grilled the executives over whether their platforms are biased against conservatives, while Democrats chastised them for not doing enough to stop the spread of false information.
The platforms will get their next test during two January runoff elections in Georgia that will determine which party controls the Senate. Facebook and Google have each extended bans on certain political advertisements in the run-up to the vote. But the move has done little to silence their critics.
With both Republicans and Democrats still clamouring for change, experts warn that Silicon Valley will remain in the political hot seat – and the incoming Biden administration is unlikely to offer much of a reprieve from the scrutiny.
“You’ll see a lot of congressional hearings and some proposed legislation, and that’s a necessary process to work through,” said Daniel Kreiss, a professor and principal researcher at the University of North Carolina’s Center for Information, Technology and Public Life. “I don’t think you put the genie back in the bottle on this.”
At a Senate hearing two weeks after the Nov. 3 election, Mr. Zuckerberg and Mr. Dorsey told lawmakers they felt their efforts to curb foreign interference and direct voters to credible sources of election information were successful.
The companies did appear to largely avoid a repeat of 2016, when Russian actors purchased political ads on Facebook that were viewed by millions of U.S. citizens. This year, Facebook and Twitter publicly identified a handful of influence networks from Russia, Iran, Mexico and China that attempted to interfere in the election and said they shut them down before they had amassed sizable audiences.
While many of those groups were first spotted by the FBI, which then warned the social-media companies, the platforms did seem to be more prepared heading into this year’s election, said Darren Linvill, a communications professor at Clemson University in South Carolina who specializes in tracking online disinformation.
“I think as a society we did a better job in the run-up to the election, at least in identifying and controlling foreign influence,” he said.
But the platforms struggled in the weeks after the vote to contain a torrent of misinformation about voter fraud – much of it coming from President Donald Trump and his supporters.
A Facebook group called “Stop the Steal” – linked to the online conspiracy movement known as QAnon – amassed more than 300,000 members before it was shut down.
Twitter scrambled to slow the spread of hashtags such as #sharpiegate, a reference to a conspiracy theory that some Republican ballots in Arizona were rejected because voters had used Sharpie pens – even though election officials have confirmed that doing so would not prevent a ballot from being counted.
YouTube said last week that it would temporarily ban One America News Network, a tiny San Diego-based television station that has gained a large following on the site by claiming Mr. Trump won the election and by promoting fake cures for COVID-19.
Social-media companies also waffled during the election over how to handle a controversial New York Post report on Joe Biden’s son Hunter – first preventing users from sharing the article and then reinstating it with an apology.
Last week, Twitter blocked a link to a lawsuit alleging voter fraud, filed by lawyer Sidney Powell, before reversing course. “Release the Kraken,” a movie reference that has become code for election-fraud conspiracies, began trending online after Ms. Powell posted it to her Twitter account.
Platforms have still not figured out how to respond to such homegrown conspiracies and to groups such as QAnon, which are made up not of fake accounts deliberately spreading misinformation but of voters promoting their unfounded beliefs, Prof. Linvill said.
“QAnon is exactly what the platforms have not got their head around yet … How do you respond to these social movements that serve to spread misinformation when a lot of it is done through innuendo, through opinion … and still allow for some semblance of freedom of speech on your platform?” he said.
Critics warn the platforms’ bans on political advertising for the Jan. 5 runoff in Georgia have made it harder for the candidates to reach voters, even as some demand the platforms do more to protect the election.
Facebook and Twitter “must expect an onslaught of the malign tactics of voter suppression and delegitimization seen in the Presidential election, and cannot backslide or regress in its moral and civic responsibility to protect our democracy,” a group of Democratic senators and former presidential candidate Bernie Sanders wrote in a letter to the platforms last week. The senators sent a separate letter to YouTube similarly urging it to prevent the spread of misinformation during the Georgia vote.
No matter which party wins control of the Senate, lawmakers are gearing up for an intense debate about the role of social media in a democracy.
Most significantly, both parties are rallying support to overhaul Section 230 of the Communications Decency Act, which shields social-media platforms from liability for content posted by their users.
Mr. Trump has seized on complaints that the law allows social-media companies to censor conservative speech, signing an executive order in May that called on the Federal Communications Commission (FCC) to clarify what types of content-moderation decisions are protected under the law. He called for Section 230 to be “immediately terminated” in a tweet last week, claiming the law is a threat to national security.
The liability shield has also drawn the ire of Democrats, who argue that it allows platforms to avoid removing harmful content.
In January, Mr. Biden told The New York Times that “Section 230 should be revoked, immediately should be revoked.” He also described Silicon Valley employees as “creeps.”
The president-elect is expected to name a new chair of the FCC after he takes office, an appointment that could determine the outcome of the agency’s review of the law. His picks for attorney-general and other regulatory posts could influence antitrust lawsuits and investigations under way into the conduct of tech giants, including a Justice Department lawsuit against Google and a Federal Trade Commission investigation into Facebook.
In the face of a bipartisan backlash over Section 230, social-media companies have started offering suggestions on how to change the liability protections.
Mr. Zuckerberg has called for a requirement that platforms be more transparent about how they make decisions on removing content. “It may make sense for there to be liability for some of the content that is on the platform,” he told the Senate hearing last month.
Mr. Dorsey has argued for the creation of an appeals process for content-removal decisions and promoted what he calls “algorithmic choice” – giving users more discretion over the algorithms that prioritize the content they see in their social-media feeds.
“It seems somewhat inevitable that platforms will begin to take greater responsibility, one way or another,” wrote Brian Wieser, head of business intelligence at GroupM, the media-buying arm of global advertising firm WPP, in a research note.
Rather than quell the backlash against Silicon Valley, analysts say the presidential election only served to fuel the drive to regulate social-media platforms. “I think we are in an era of very deep conflicts over identity, over values. They’re deep and fundamental conflicts,” Prof. Kreiss said. “And I think platforms will continue to get caught in the middle of all these much larger debates.”