The myth of responsible encryption: Experts say it can’t work – CNET


Governments want tech companies to create a master key that only law enforcement can use. Security experts say it’s a fantasy.

James Martin/CNET

Governments want to have their cake and eat it too.

Many support a concept called responsible encryption: the idea that companies could provide complete privacy and security for people while also letting law enforcement see encrypted messages to better protect you.

Sounds fantastic, right? Unfortunately, security specialists say it’s a paradox.

Yet the concept continues to rear its head. The most recent responsible-encryption advocate is US Deputy Attorney General Rod Rosenstein. During a speech to the US Naval Academy on Tuesday, Rosenstein called out tech companies for refusing to help with uncovering private messages.

“Responsible encryption can protect privacy and promote security without forfeiting access for legitimate law enforcement needs supported by judicial approval,” he said, according to a transcript.

Rosenstein isn’t alone. Officials in Australia and the UK have also called for responsible encryption, despite the fact that both governments have suffered major breaches that shatter the concept.

Responsible encryption, according to the lawmakers who demand it, would require companies to create a secret key, or back door, that would make it possible to read coded data. Only the government could access the key, so that with the proper warrant or court order, law enforcement could read through messages. The key would be kept secret — unless hackers stole it in a breach.
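The escrow arrangement lawmakers describe can be made concrete with a toy sketch (this is illustrative pseudo-crypto, not a real or secure scheme, and none of the names come from any actual proposal): the key that encrypts a conversation is wrapped twice, once for the recipient and once for a government-held escrow key, so either keyholder, or anyone who steals the escrow key, can recover it.

```python
import hashlib
import os

def wrap(kek: bytes, data: bytes) -> bytes:
    """Toy key-wrapping: XOR data with a hash-derived pad (illustrative only, NOT secure)."""
    pad = hashlib.sha256(kek).digest()
    return bytes(d ^ p for d, p in zip(data, pad))

unwrap = wrap  # XOR is its own inverse

message_key = os.urandom(16)            # the key that actually encrypts the chat
recipient_kek = b"held by recipient"
escrow_kek = b"held by the government"

# "Responsible encryption": the same message key is wrapped twice,
# once for the recipient and once for the escrow holder.
for_recipient = wrap(recipient_kek, message_key)
for_escrow = wrap(escrow_kek, message_key)

# Either keyholder recovers the message key -- and so does anyone
# who steals escrow_kek in a breach.
assert unwrap(recipient_kek, for_recipient) == message_key
assert unwrap(escrow_kek, for_escrow) == message_key
```

The sketch makes the security experts' objection mechanical: the escrow copy is a second door into every conversation, and its safety rests entirely on one key never leaking.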

Companies like Apple, WhatsApp and Signal provide end-to-end encryption, meaning people can chat privately, with their messages hidden even from the companies themselves. Such encryption means that only you and the person to whom you sent your messages can read them, since no one else has a key to unlock the code.
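The "only the endpoints hold the key" property can be shown with a minimal toy sketch in Python (again, an illustration of the concept, not real cryptography and not how Signal or WhatsApp actually work): Alice and Bob share a key, the relay server only ever sees ciphertext, and without the key the ciphertext is unreadable.

```python
import hashlib

def keystream(key: bytes):
    """Derive an endless byte stream from the key (toy construction, NOT secure)."""
    counter = 0
    while True:
        block = hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        yield from block
        counter += 1

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # XOR the message with the key-derived stream; decryption is the same operation.
    return bytes(p ^ k for p, k in zip(plaintext, keystream(key)))

decrypt = encrypt  # XOR is its own inverse

shared_key = b"known only to Alice and Bob"
msg = b"meet at noon"
ciphertext = encrypt(shared_key, msg)

# The relay server handles only ciphertext; without shared_key it
# cannot recover the message, and neither can the service provider.
assert ciphertext != msg
assert decrypt(shared_key, ciphertext) == msg
```

Real messengers establish the shared key with authenticated key exchange rather than pre-sharing it, but the consequence is the same: there is no extra copy of the key for the company, or a government, to use.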

End-to-end encryption provides security and privacy for people who want to make sure no one's spying on their messages — a desire some would call modest in an age of mass surveillance. Governments around the world have a problem with that, though.

Rosenstein instead sees a future where companies keep their data encrypted unless the government needs it to investigate a crime or a potential terrorist attack. It's the same rallying cry UK Prime Minister Theresa May made after the June 4 terrorist attack on London Bridge, when she blamed encryption for providing a safe space for extremists.

Rosenstein uses password recovery and email scanning as examples of responsible encryption. But neither of those involves end-to-end encryption. He references an unnamed "major hardware provider," which "maintains private keys it can use to sign software updates for each of its devices." And then he touches on a major problem with responsible encryption: Creating a back door for police also means creating an opening for hackers.

“That would present a huge potential security problem, if those keys were to leak,” Rosenstein said. “But they do not leak, because the company knows how to protect what is important.”

Except these important files have leaked on multiple occasions, including from the US government itself.

Adobe accidentally released its private key on its security blog in September. In 2011, RSA’s SecurID authentication tokens were stolen. The notorious malware Stuxnet used stolen encryption keys to install itself. The US National Security Agency has fallen victim to multiple breaches, from Russian spies stealing its secrets to the Shadow Brokers hacker group selling the agency’s tools.

“When the companies have the keys, they can be stolen,” said security researcher Jake Williams, founder of cybersecurity provider Rendition Infosec. “Law enforcement calls [end-to-end encryption] ‘warrant proof crypto,’ but many companies will tell you they’re not trying to dodge a warrant, they’re just doing what’s right for security.”

It’s why Apple refused to create a back door for the FBI in 2016, when the agency wanted to crack into an iPhone belonging to one of the shooters in the San Bernardino terror attack. Apple CEO Tim Cook said last year that such a back door is “the equivalent of cancer,” arguing that the master key could be stolen and abused by hackers, as keys had been in previous cases.

It’s unclear why Rosenstein seems to think encryption keys can’t be stolen. The Justice Department confirmed Rosenstein’s comments and declined to elaborate.

The call for encryption loopholes has alarmed the security community, which says it’s experiencing deja vu.

“I think it’s extremely concerning that the man responsible for prosecuting crimes on the federal level would expect the invasion of everyone’s privacy simply to make law enforcement’s job easier,” said Mike Spicer, an expert and founder of the security company Initec.

The myth resurfaces nearly every year, said Eva Galperin, the cybersecurity director at the Electronic Frontier Foundation, a digital-rights group. Every time, the EFF slams the demand, saying it’s a “zombie argument.”

“Calling it responsible encryption is hypocritical,” Galperin said. “Building insecurity in your encryption is irresponsible.”
