Facebook refuses to break end-to-end encryption
Congress on Tuesday told Facebook and Apple that they had better put backdoors into their end-to-end encryption, or it will pass laws that force tech companies to do so.
At a Senate Judiciary Committee hearing on Tuesday, where Apple and Facebook representatives testified about the value of encryption that hasn’t been weakened, Sen. Lindsey Graham had this to say:
You’re going to find a way to do this or we’re going to do this for you.
We’re not going to live in a world where a bunch of child abusers have a safe haven to practice their craft. Period. End of discussion.
It’s the latest shot fired in the ongoing war over encryption. The most recent salvos have been launched following the privacy manifesto that Facebook CEO Mark Zuckerberg published in March.
At the time, Zuckerberg framed the company’s new stance as a major strategy shift that involves developing a highly secure private communications platform based on Facebook’s Messenger, Instagram, and WhatsApp services.
Facebook’s plan is to leave the three chat services as standalone apps but to also stitch together their technical infrastructure so that users of each app can talk to each other more easily.
The plan also includes slathering the end-to-end encryption of WhatsApp – which keeps anyone, including Facebook itself, from reading the content of messages – onto Messenger and Instagram. At this point, Facebook Messenger supports end-to-end encryption only in its opt-in Secret Conversations mode: a mode that’s off by default and has to be enabled for every chat. Instagram has no end-to-end encryption on its chats at all.
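The core idea behind that “even Facebook can’t read it” guarantee can be illustrated with a toy Diffie-Hellman key exchange in Python. This is a sketch only, with deliberately simplified parameters – WhatsApp actually uses the far more elaborate Signal protocol – but it shows why a relaying server never learns the key the two endpoints share:

```python
import secrets

# Toy Diffie-Hellman key agreement, illustrating the idea behind
# end-to-end encryption: the two endpoints derive a shared secret
# that the relaying server never sees. A sketch only -- real
# messaging protocols (e.g. the Signal protocol used by WhatsApp)
# layer much more on top of this.

# Standard 1024-bit MODP group from RFC 2409 (too small for modern
# use; shown here purely for illustration).
P = int(
    "FFFFFFFFFFFFFFFFC90FDAA22168C234C4C6628B80DC1CD1"
    "29024E088A67CC74020BBEA63B139B22514A08798E3404DD"
    "EF9519B3CD3A431B302B0A6DF25F14374FE1356D6D51C245"
    "E485B576625E7EC6F44C42E9A637ED6B0BFF5CB6F406B7ED"
    "EE386BFB5A899FA5AE9F24117C4B1FE649286651ECE65381"
    "FFFFFFFFFFFFFFFF", 16)
G = 2

# Each endpoint picks a private exponent; only g^x mod p goes
# over the wire, via the server.
alice_priv = secrets.randbelow(P - 2) + 1
bob_priv = secrets.randbelow(P - 2) + 1
alice_pub = pow(G, alice_priv, P)  # relayed by the server
bob_pub = pow(G, bob_priv, P)      # relayed by the server

# Each side combines its own private value with the other's
# public value to get the same shared secret.
alice_shared = pow(bob_pub, alice_priv, P)
bob_shared = pow(alice_pub, bob_priv, P)

assert alice_shared == bob_shared
# The server saw only alice_pub and bob_pub; recovering the shared
# secret from those alone is the discrete-logarithm problem.
```

A backdoor, in these terms, would mean giving some third party a way to derive (or escrow) that shared secret – and any mechanism that can do so for law enforcement can, if compromised, do so for anyone.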
You had better end – or at least pause – your plan, three governments warned Facebook in October.
US Attorney General William Barr and law enforcement chiefs of the UK and Australia signed an open letter calling on Facebook to back off of its “encryption on everything” plan unless it figures out a way to give law enforcement officials backdoor access so they can read messages.
“No,” Facebook said – with all due respect to law enforcement and its need to keep people safe.
On Monday, Facebook released an open letter it penned in response to Barr.
In the letter, WhatsApp and Messenger heads Will Cathcart and Stan Chudnovsky said that any backdoor access into Facebook’s products created for law enforcement would weaken security and let in bad actors who would exploit the access. That’s why Facebook has no intention of complying with Barr’s request that the company make its products more accessible, they said:
The ‘backdoor’ access you are demanding for law enforcement would be a gift to criminals, hackers and repressive regimes, creating a way for them to enter our systems and leaving every person on our platforms more vulnerable to real-life harm.
People’s private messages would be less secure and the real winners would be anyone seeking to take advantage of that weakened security. That is not something we are prepared to do.
In his opening statement on Tuesday, Sen. Graham – the chairman of the Senate Judiciary Committee – told Apple and Facebook representatives that he appreciates “the fact that people cannot hack into my phone,” but encrypted devices and messaging create a “safe haven” for criminals and child exploitation.
In Facebook’s letter, Cathcart and Chudnovsky pointed out that cybersecurity experts have repeatedly shown that weakening any part of an encrypted system means that it’s weakened “for everyone, everywhere.” It’s impossible to create a backdoor just for law enforcement that others wouldn’t try to open, they said.
They’re not alone in that belief, they said. Over 100 organizations, including the Center for Democracy and Technology and Privacy International, responded to Barr’s letter to share their views on why creating backdoors jeopardizes people’s safety. Facebook’s letter also quoted cryptographer Bruce Schneier, from comments he made earlier this year:
You have to make a choice. Either everyone gets to spy, or no one gets to spy. You can’t have ‘We get to spy, you don’t.’ That’s not the way the tech works.
And as it is, Facebook is already working on making its platforms more secure, they said. It’s more than doubled the number of employees who are working on safety and security, and it’s using artificial intelligence (AI) to detect bad content before anyone even reports it or, sometimes, sees it. For its part, WhatsApp is detecting and banning two million accounts every month, based on abuse patterns. It also scans unencrypted information – such as profile and group information – looking for tell-tale content such as child abuse imagery.