The Zoom Privacy Backlash Is Only Getting Started

The popular video conferencing application Zoom has been having a Moment during the Covid-19 pandemic. But it’s not all positive. As many people’s professional and social lives move completely online, Zoom use has exploded. With this boom, though, has come added scrutiny from security and privacy researchers—and they keep finding more problems, including two fresh zero-day vulnerabilities revealed Wednesday morning.

The debate has underscored the inherent tension of balancing mainstream needs with robust security. Go too far in either direction, and valid criticism awaits.

“Zoom has never been known as the most hardcore secure and private service, and there have certainly been some critical vulnerabilities, but in many cases there aren’t a lot of other options,” says security researcher Kenn White. “It’s absolutely fair to put public pressure on Zoom to make things safer for regular users. But I wouldn’t tell people ‘Don’t use Zoom.’ It’s like everyone is driving a 1989 Geo and security folks are worrying about the airflow in a Ferrari.”

Zoom isn’t the only video conferencing option, but displaced businesses, schools, and organizations have coalesced around it amid widespread shelter-in-place orders. It’s free to use, has an intuitive interface, and can accommodate group video chats for up to 100 people. There’s a lot to like. By contrast, Skype’s group video chat feature only supports 50 participants for free, and live streaming options like Facebook Live don’t have the immediacy and interactivity of putting everyone in a digital room together. Google offers multiple video chat options—maybe too many, if you’re looking for one simple solution.

At the same time, recent findings about Zoom’s security and privacy failings have been legitimately concerning. Zoom’s iOS app was quietly—and the company says accidentally—sending data to Facebook without notifying users, even if they had no Facebook account. The service pushed a fix late last week. Zoom also updated its privacy policy over the weekend after a report revealed that the old terms would have allowed the company to collect user information, including meeting content, and analyze it for targeted advertising or other marketing. And users have been creeped out by Zoom’s attention-tracking feature, which lets the meeting host know if an attendee hasn’t had the Zoom window in their screen’s foreground for 30 seconds.

During the pandemic, a type of online abuse known as Zoombombing, in which trolls exploit Zoom’s default screen-sharing settings to take over meetings—often with racist messages or pornography—has also spiked. Zoom offers tools to protect against that sort of assault, specifically the option to password-protect your meeting, add a waiting room for vetting attendees, and limit screen sharing. Some paid and free specialty versions of the service, like Zoom for Education, also have different screen-sharing defaults. But in general the service doesn’t highlight these options in a way that would make them intuitive to enable.

“It’s as though, in suddenly shifting from the office to work from home, we didn’t so much move the conference room into our kitchens as into the middle of the public square,” says Riana Pfefferkorn, associate director of surveillance and cybersecurity at Stanford’s Center for Internet and Society. “Enterprise platforms are now seeing the same abuse problems that we’ve long been used to seeing on Twitter, YouTube, Reddit, etc. Those platforms were inherently designed to let strangers contact other strangers—and yet they had to tack on anti-abuse features after the fact too.”

Perhaps most jarring of all, the service has a security feature that it falsely described as being “end-to-end encrypted.” Turning on the setting does strengthen the encryption on your video calls, but it does not make them truly end-to-end encrypted, meaning Zoom’s own servers can still access the unencrypted audio and video. Achieving full end-to-end encryption in group video calling is difficult; Apple memorably spent years finding a way to implement it for FaceTime. And for a service that can support so many streams on each call, it was always unlikely that Zoom had actually achieved this protection, despite its marketing claims.

Zoom did not return a request for comment from WIRED about how it is handling this deluge of security and privacy findings in its product. On Thursday, though, Zoom founder and CEO Eric Yuan wrote in an extensive public statement that the company is pausing feature development so its engineers can focus solely on security and privacy improvements. Yuan said that over the next 90 days Zoom will also conduct third-party security audits and penetration tests, expand its bug bounty program, and prepare a transparency report on data requests the company has received from entities like governments and law enforcement.

The recent blemishes are compounded by the fact that even before the pandemic, Zoom had a reputation for prioritizing ease of use over security and privacy. Notably, a researcher revealed flaws last summer in how Zoom seamlessly joined users to calls through meeting links and shared their camera feeds without first checking that they wanted to launch the app. That means attackers could have crafted Zoom links that instantly gave them access to a user’s video feed—and everything going on around them—with one click. The research also built on previous Zoom vulnerability findings.

Zoom’s gaffes have also started to invite even more potentially consequential scrutiny. The company is facing a class action lawsuit over the data its iOS app sent to Facebook. And the office of New York attorney general Letitia James sent a letter to the company on Monday about its mounting punch list. “While Zoom has remediated specific reported security vulnerabilities, we would like to understand whether Zoom has undertaken a broader review of its security practices,” the attorney general’s office wrote.

Given this track record and all the commotion about Zoom security in the past few weeks, macOS security researcher Patrick Wardle says he recently got interested in poking at the Mac desktop Zoom app. Today he is disclosing two new security flaws he found during that brief analysis.

“Zoom, while great from a usability point of view, clearly hasn’t been designed with security in mind,” Wardle says. “I saw some researchers tweeting about strange Zoom behavior and literally within 10 seconds of looking at it myself I was just like, aw, man. Granted I research this stuff, so I know what to look for. But Zoom has just had so many missteps, and that’s very indicative of a product that has not been adequately audited from a security point of view.”

Wardle’s findings pose limited risk to users in practice, because they would first require the presence of malware on a target device. One attack focuses on a Zoom installation flow that still relies on a now-retired application programming interface from Apple. Apple deprecated the API because of security concerns, but Wardle says that he sometimes still sees products using it as a lazy workaround. An attacker who has infected a victim device with malware, but hasn’t yet achieved full access, could exploit Zoom’s insecure install settings to gain root privileges.
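
For the technically curious, here is a minimal, generic C sketch of the pattern at issue. It is not Zoom’s actual installer code, and it assumes the deprecated call in question is Apple’s AuthorizationExecuteWithPrivileges, which runs whatever binary sits at a given path as root without verifying that the file hasn’t been swapped or tampered with by an unprivileged local attacker.

/*
 * Illustrative sketch only -- not Zoom's code. Assumption: the deprecated
 * Apple API described above is AuthorizationExecuteWithPrivileges.
 * The core problem: the API blindly executes whatever is at helper_path
 * as root, so local malware that can overwrite that file (for example, a
 * helper copied to a user-writable temp directory) gets root privileges.
 * Build with: clang sketch.c -framework Security
 */
#include <Security/Security.h>
#include <stdio.h>

int run_helper_as_root(const char *helper_path) {
    AuthorizationRef auth = NULL;
    if (AuthorizationCreate(NULL, kAuthorizationEmptyEnvironment,
                            kAuthorizationFlagDefaults,
                            &auth) != errAuthorizationSuccess) {
        return -1;
    }

    char *args[] = { NULL };  /* no arguments passed to the helper */

    /* Deprecated since macOS 10.7: no check that helper_path is still the
     * binary the installer intended to run when the user approved it. */
    OSStatus status = AuthorizationExecuteWithPrivileges(
        auth, helper_path, kAuthorizationFlagDefaults, args, NULL);

    AuthorizationFree(auth, kAuthorizationFlagDefaults);
    return (status == errAuthorizationSuccess) ? 0 : -1;
}

Apple’s recommended replacement, a privileged helper installed via SMJobBless and the system’s launchd, avoids this by validating code signatures on both sides rather than trusting a file path.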

The other vulnerability Wardle found is also significant, though still only a local access bug. macOS offers a feature called “hardened runtime” that lets the operating system act as a sort of bouncer while programs are running, preventing code injections and other manipulations that are typically malicious. Developers can choose to add exemptions for third-party plugins if they want that additional functionality from an external source, but Wardle notes that such exceptions are typically a last resort, because they undermine the whole premise of the hardened runtime. Yet Zoom’s macOS application has such an exemption for third-party libraries. That means malware running on a victim’s system could inject code into Zoom that the operating system then treats as trusted, essentially linking the two applications. The malware could then piggyback on Zoom’s legitimate microphone and video access to listen in on a victim or watch through their webcam whenever it wants. Wardle confirmed that Zoom had fixed both flaws by Wednesday night.

Though it doesn’t look like researchers will stop finding flaws in Zoom any time soon, the most important takeaway for regular users is simply to think carefully about their security and privacy needs for each call they make. Zoom’s security is likely sufficient for most people’s general communications, but there are more protected group video chat options—like those offered by WhatsApp, FaceTime, and particularly Signal—that could be a better fit for sensitive gatherings.

“The reality is that companies are going to have mistakes in their software,” says Jonathan Leitschuh, a security researcher who found the webcam hijacking flaws in Zoom last summer. “The more criticism of a platform, the more secure it’s hopefully going to be. So hopefully Zoom is taking the information that they’re gaining and actually acting on it. But if you need to be secure and secret, I would not recommend you have those conversations over Zoom. Use a platform that’s built for the level of security you need.”

Updated April 2, 2020 at 9:50 am ET to include Zoom’s public statement and Wardle’s confirmation that two vulnerabilities had been patched.

