WASHINGTON, D.C. — Finger-pointing began before the Jan. 6 attack on the U.S. Capitol even ended.
Many faulted then-President Donald Trump for riling up the mob and making baseless claims about election fraud. Some criticized security officials for failing to secure the Capitol, or accused the media of divisive reporting that pushed Trump supporters too far.
Others believe Silicon Valley deserves a fair share of the blame.
“Social media platforms played a role in radicalizing and emboldening terrorists to attack our Capitol,” said Rep. Anna Eshoo, D-Palo Alto. “These American companies must fundamentally rethink algorithmic systems that are at odds with democracy.”
Eshoo and Rep. Tom Malinowski, D-New Jersey, sent joint letters last week to CEOs Mark Zuckerberg of Facebook, Jack Dorsey of Twitter, Sundar Pichai of Google and Susan Wojcicki of YouTube. They called on the executives to make permanent design changes to limit the spread of radicalizing or conspiratorial content.
“Facebook, like other social media platforms, sorts and presents information to users by feeding them the content most likely to reinforce their existing political biases, especially those rooted in anger, anxiety and fear,” they wrote to Zuckerberg. “The algorithms Facebook uses to maximize user engagement on its platform undermine our shared sense of objective reality, intensify fringe political beliefs and facilitate connections between extremist users.”
In the wake of the deadly Capitol attack, social media companies cracked down on accounts that promoted conspiracy theories or other content likely to incite violence.
Twitter has removed more than 70,000 accounts related to QAnon, a far-right conspiracy theory whose adherents believe Trump is secretly fighting a global pedophile ring of Satan-worshipping Democrats. Meanwhile, Facebook targeted content that falsely states the 2020 presidential election was stolen from Trump.
Multiple platforms, including Twitter, Facebook and Instagram, even suspended Trump’s account.
Eshoo and Malinowski acknowledged the tech giants were taking new steps to prevent the spread of dangerous content. But the lawmakers said these actions are not enough.
“Content moderation on a service with more than 2.7 billion monthly users is a whack-a-mole answer to a systemic problem, one that is rooted in the very design of Facebook,” the letter to Zuckerberg states.
Chloe Meyere, a corporate communications manager for Facebook, told San José Spotlight the company has strived for years to block dangerous content.
She said Facebook took extra precautions prior to the presidential inauguration, including using artificial intelligence to demote content that likely violated company policy and increasing the requirement for group administrators to review and approve posts before they could be viewed.
“We’re keeping these measures in place,” she said.
While some believe social media companies haven’t gone far enough, others argue the current restrictions are too extreme.
The Santa Clara County Republican Party recently received a letter from a group of veterans — who asked for anonymity — sharing their concerns. The letter, which was published on the organization’s website, states that any effort to limit speech is dangerous.
“Censorship does not change anyone’s mind,” the letter states. “Nor does it make the opposition go away. It simply drives dissent underground, where it stews like a primed IED.”
It is a deeply complex issue, according to David Snyder, executive director of the First Amendment Coalition. The coalition is a nonprofit dedicated to advancing free speech, open government and public participation in civic affairs.
The First Amendment applies only to the government, Snyder said, meaning private entities like Facebook are within their rights to enforce new restrictions. But Snyder said the coalition urges social platforms to give users “a long leash” to share ideas and opinions — unless their posts specifically call for violence.
“I don’t think even the most adamant free expression advocates would promote the ability of people to directly and explicitly incite violence against other people,” he said.
Snyder, however, encouraged the tech giants to clearly define and enforce their rules.
“Social media companies have not been very transparent or consistent about how their rules are applied,” he said. “There is a deep mistrust and suspicion about when and how they decide to block or delete accounts.”
Even prior to the Capitol attack, Eshoo and Malinowski pushed for more accountability from social media platforms.
The lawmakers recently introduced the Protecting Americans from Dangerous Algorithms Act. The bill would amend Section 230 of the Communications Decency Act so social media platforms with more than 50 million users could be held accountable for the algorithmic amplification of radicalizing content that leads to violence.
Section 230 provides tech companies with broad legal immunity for what their users post online. Proponents say it protects free expression, while others argue it allows tech giants to be irresponsible.
At a congressional hearing in October, senators debated the issue and grilled Zuckerberg, Dorsey and Pichai.
Zuckerberg told legislators social media companies are in a difficult position. Democrats often say Facebook does not remove enough content, he said, while Republicans argue the company removes too much.
“There are real disagreements about where the limits of online speech should be,” he said.
Contact Katie King at [email protected] or follow @KatieKingCST on Twitter.