Republicans Blow Off Intelligence Hearing on Election Security With Top Silicon Valley Officials

House Intelligence Committee Ranking Member Devin Nunes (R-CA) listens to testimony during the impeachment inquiry into US President Donald Trump on November 20, 2019.
Photo: Doug Mills (Getty)

Republicans on the House Intelligence Committee were a no-show as top officials from Facebook, Google, and Twitter testified and answered questions about their companies’ efforts to safeguard voters from the spread of disinformation concerning the 2020 presidential election.

The decision by the committee’s ranking member, Rep. Devin Nunes, to boycott the hearing was revealed roughly an hour before Facebook announced a Trump campaign ad featuring a Nazi symbol used to identify political prisoners in concentration camps had been removed from the platform for violating its policy against organized hate.

A Nunes spokesperson did not respond when asked if Republicans had advance knowledge of Facebook’s decision or whether it had informed their apparent protest. Likewise, Facebook did not respond when asked if the White House knew the takedown was imminent.

Left to their own devices, Democrats on the intel panel, led by Rep. Adam Schiff, drilled the three tech companies over a variety of election-related issues, focusing largely on posts designed to misinform voters and efforts by foreign actors to sow divisions between groups on pressing matters of the day, including the ongoing Black Lives Matter protests.

At the top of the hearing, Schiff quickly went after Google’s witness, Richard Salgado, who leads the company’s law enforcement and information security team, asking why the company was known among academics and “observers of the tech sector” as the “least transparent of the major technology companies.” Schiff compared Google to Twitter, which in 2018 began providing researchers with specific details about accounts involved in disinformation campaigns targeting American voters.

“Well, I certainly hope that’s not the perception. If it is, it’s a misperception,” Salgado replied.

Salgado pointed to YouTube’s transparency reports which provide statistics around the number of flagged videos and content removed. (Google removed 31.2 million videos last year, 27.7 million of which were removed via automated flagging.) “We have launched a bulletin that we publish quarterly that goes into influence operations that we’ve seen. We released a set just a few weeks ago,” he added.

Salgado, who declined to say whether YouTube would be willing to release more granular data on specific channels removed for spreading fake election information, noted that Google is not overall as big a target as Facebook or Twitter. However, it has released details about who is behind certain deceptive advertisements, he said, adding: “If the focus is on YouTube we can take it back and see if there’s something more useful in that arena.”

Despite having disparate policies toward election-related posts, Facebook and Twitter officials both testified to witnessing similar shifts in tactics by foreign actors since the 2016 election; namely, the increased use of authentic foreign accounts as opposed to the fake accounts that were a fixture of Russian influence operations back then. Both companies attributed the change to their own ramped-up efforts to remove accounts dubbed “inauthentic,” i.e., those operating from overseas while pretending to be American.

In March, Facebook took down a fake account masquerading as the cousin of a Black man who died in police custody. The account was said to be operated out of Africa on behalf of known Russian trolls and was reportedly part of a much larger network—49 accounts and 69 pages—exposed in an investigation aided by CNN reporters.

“We have seen a change in tactics,” Twitter’s director of global public policy strategy and development Nick Pickles said. “This, in part, is a result of the success that we’ve had in punting down on the inauthentic platform manipulation operations. So activity, particularly around [covid-19], geopolitics, but also the issues in the United States surrounding particularly policing, have transferred into state-controlled media, have transferred into geopolitical space.”

As an example, Pickles said, official Chinese accounts have compared the response by U.S. police to protesters with their own response in Hong Kong. “And so that shift from platform manipulation to overt state assets is something we’ve observed,” he said.

Facebook’s head of security policy Nathaniel Gleicher concurred, adding: “We definitely see the tactics in this space evolving and we see the threat actors trying new efforts to get around the controls that are put in place. We haven’t seen coordinated inauthentic behavior on the part of foreign governments, particularly targeting voting systems or how to vote in the United States. It’s definitely something we’re monitoring.”

“We know from past disclosures that foreign actors have taken advantage of our platforms to spread misinformation which only undermines our democratic discourse,” Congresswoman Terri Sewell said. “Through your platforms, these actors can attempt to perversely influence or skew our national conversation towards chaos and confusion, and in fact, they’ve done so.”

Sewell, who is Black, said that propaganda designed to suppress the Black vote “has been a part of our democracy since we were able to vote.” She went on to acknowledge that while voter suppression is not a particularly new phenomenon, social media “creates the potential for such voter suppression tactics and misinformation to spread even further.” Sewell pointed to the Russian information operations in 2016 that, according to a Senate Intelligence Committee report, targeted no single group more than African Americans.

One such operation involved creating and managing a fake account that mimicked the Black Lives Matter movement. Black communities have been a target of voter suppression tactics for generations, Sewell added, “always bearing the brunt when institutions like yours don’t take responsibility to stop the spread of misinformation.”

Two congressmen, Jim Himes and Denny Heck, harshly criticized Facebook in particular over its role in what Himes characterized as deepening “polarization, division, anger” among Americans with diverse political beliefs. “I’m glad everybody is doing so much work to try and identify foreign presence and all that sort of thing,” Himes said, “but I’m pretty convinced that when this republic dies it doesn’t happen because the Russians broke into Ohio voting machines or they managed to buy ads on Facebook or Twitter. It happens because our politics becomes so toxic, so polarized, we don’t recognize each other anymore as Americans.”

“All it takes is a match” lit by Russia, Iran, North Korea, or China, Himes said, once every American household is “full of toxic explosive gas, as I think it is today.” He went on to compare the type of content mainly surfaced by Facebook’s algorithms to a “car crash” that users could not avoid watching.

“Understanding how to ensure not just authentic but positive and collaborative public debate is absolutely critical, I completely agree,” said Gleicher, who denied that what Facebook users want to see most is “clickbait” and that pushing divisive content is what profits Facebook most. “They don’t want to see the kind of divisive content you’re describing,” he said.

“I’d like to see the facts behind the studies underlying the notion that people don’t like divisiveness and that they don’t like clickbait,” Himes replied. “I mean clickbait is a thing because people like clickbait.”

Heck later criticized how quickly Gleicher dismissed concerns about Facebook’s platform fueling the polarization of voters. “It is axiomatic that civil discourse in America has degraded. That is inarguable,” he said. “It is also equally self-evident that the social media platforms that we are here talking about have amplified that degraded civic discourse—and, as a corollary to that, that you have profited.”

Heck pressed the officials to accept responsibility, or else explain why they would not. “Politicians aren’t exempt,” he admitted. “Our tradecraft has fully utilized these tools to our benefit and to suggest otherwise would be hypocrisy.” But as the “bullhorn maker,” he said, Facebook not accepting responsibility seems like “a stretch.”

“Do you not accept some responsibility for this?” Heck asked point-blank.

Gleicher would not, on Facebook’s behalf, accept, but seemed to suggest instead that what Heck was describing was just human behavior and the natural result of many previously unheard voices being suddenly amplified. “Part of what you’re identifying, congressman, is how humans interact in public discussion,” he said. “It’s why we’ve taken very serious looks, it’s why we have thought about what we promote, how we promote, what we recommend, to address exactly these challenges.”

“I do also think that the rise of social media platforms, the rise of the internet has led to voices being heard at volumes that have never happened before,” Gleicher continued. “And the most difficult challenge here is how to peel these two apart: How do you mitigate some of the challenges you’re describing—and I agree these are essential challenges that we’re all grappling with—without also undermining the incredible profusion of new voices we’ve heard in public …”
