Facebook, Google and Twitter are on the hot seat.
Facebook General Counsel Colin Stretch; Sean Edgett, Twitter’s top lawyer; and Richard Salgado, Google’s director of law enforcement and information security, appeared at a Senate Judiciary subcommittee hearing on Tuesday to discuss how Russia used these companies to influence the 2016 presidential election through disinformation and fake news.
It’s the first of three hearings. On Wednesday, the three executives will testify before the House and Senate Intelligence Committees in separate sessions. (Follow CBS News’ live blog of the testimony here.)
Congress is looking to hold Silicon Valley accountable for the influence Russia was able to wield using their respective platforms. Facebook said in its opening statement that 126 million people, roughly a third of the US population, may have viewed Russian-backed content. Twitter, meanwhile, confirmed that more than 2,700 Twitter accounts were associated with the Internet Research Agency, a Russian-backed troll farm. Altogether, those trolls spread propaganda and fake news that garnered more than 414 million impressions on Facebook and Twitter.
“We had a foreign government apparently buying thousands of dollars of advertising to create discontent and discord,” Republican Sen. Lindsey Graham of South Carolina said as he opened the subcommittee hearing Tuesday.
The hearings are the latest twist in the high-profile investigation into Russia’s influence over the US election. At issue is how much the Russian government may have attempted to influence the electorate and whether President Donald Trump or anyone working for him was knowingly involved. Trump has repeatedly denied involvement.
Disinformation has long been a part of Russia’s foreign policy strategy, and social media has allowed the trolling effort to expand on a viral scale. US intelligence has warned Congress that these campaigns will continue in future elections.
The Russian trolling campaigns involved people posing as advocates on hot-button issues such as Black Lives Matter, gun rights and LGBT rights. The most popular Russian-backed account was @TEN_GOP, which amassed more than 100,000 followers.
Google said Russian trolls “uploaded over a thousand videos to YouTube on 18 different channels.”
Facebook previously acknowledged that Russians spent more than $100,000 on 3,000 ads that reached 10 million users. But in written testimony submitted to the committee, Stretch acknowledged that paid advertisements represented only a small fraction of the posts instigated by Russian operatives. He said that between 2015 and 2017, a single Russian operation in St. Petersburg generated about 80,000 posts that were seen by roughly 29 million people.
Because those posts were liked, shared and commented on by Facebook users, the company estimates that roughly 126 million people could have seen the posts, which were disguised as American political commentary.
Similarly, Twitter’s Edgett said the company identified 36,746 accounts that between Sept. 1 and Nov. 15, 2016, generated approximately 1.4 million automated, election-related Tweets, which collectively received approximately 288 million impressions or views.
But Stretch and Edgett tried to downplay the impact of this activity on their platforms. Stretch said the content shared through Facebook was a tiny fraction of what users see every day in their Facebook news feeds.
Edgett made the same argument.
“We determined that the number of accounts we could link to Russia and that were tweeting election-related content was small in comparison to the total number of accounts on our platform during the relevant time period,” he said in his testimony. “Similarly, the volume of automated, election-related tweets that originated from those accounts was small in comparison to the overall volume of election-related activity on our platform, with significantly fewer impressions as compared to a typical election-related tweet.”
Still, the companies, which initially dismissed the importance of fake news spreading unchecked, said they’re committed to ensuring this type of activity is stopped.
“When it comes to the 2016 election, I want to be clear: The foreign interference we saw is reprehensible and outrageous and opened a new battleground for our company, our industry and our society,” Stretch said in his statement. “That foreign actors, hiding behind fake accounts, abused our platform and other internet services to try to sow division and discord — and to try to undermine our election process — is an assault on democracy, and it violates all of our values.”
Twitter said that following the 2016 election, it launched the Information Quality initiative to improve its systems for detecting bad automation. The company said it has also improved its machine learning to spot spam and increased the precision of tools designed to prevent such content from spreading on its platform. Since the 2016 election, it said, it has made significant improvements to reduce external attempts to manipulate content visibility.
But lawmakers have said the companies need to do more. Earlier this month, Sens. Mark Warner, a Democrat from Virginia, and Amy Klobuchar, a Democrat from Minnesota, introduced the Honest Ads Act, which would require social networks to meet the same standards that political ads on TV and radio must meet.
Fearing unwanted regulation, Facebook and Twitter pre-emptively adopted new policies days before the hearings this week. Facebook promised more transparency for its political ads. Twitter announced it was blocking ads from Russia-sponsored news sites Russia Today and Sputnik.
Whether the companies can stave off regulation will depend in large part on their performance this week before lawmakers. During the tech giants’ first testimony on Capitol Hill on Oct. 4, senators slammed the three companies for not taking the threats seriously.
“Frankly, their initial reaction to Russian involvement has been a bit embarrassing,” Sen. Mark Warner, the top Democrat on the Senate Intelligence Committee, said in a Q&A with CNET on Friday. “They underestimated how serious the problem was.”