
WhatsApp would not remove end-to-end encryption for UK law, says chief – The Guardian

Meta’s head of chat app says it would not comply with the requirements set out in online safety bill
WhatsApp would refuse to comply with requirements in the online safety bill that attempted to outlaw end-to-end encryption, the chat app’s boss has said, casting the future of the service in the UK into doubt.
Speaking during a UK visit in which he will meet legislators to discuss the government’s flagship internet regulation, Will Cathcart, Meta’s head of WhatsApp, described the bill as the most concerning piece of legislation currently being discussed in the western world.
He said: “It’s a remarkable thing to think about. There isn’t a way to change it in just one part of the world. Some countries have chosen to block it: that’s the reality of shipping a secure product. We’ve recently been blocked in Iran, for example. But we’ve never seen a liberal democracy do that.
“The reality is, our users all around the world want security,” said Cathcart. “Ninety-eight per cent of our users are outside the UK. They do not want us to lower the security of the product, and just as a straightforward matter, it would be an odd choice for us to choose to lower the security of the product in a way that would affect those 98% of users.”
“End-to-end” encryption is used in messaging services to prevent anyone but the recipients of a communication from being able to decrypt it. WhatsApp cannot read messages sent over its own service, and so cannot comply with law enforcement requests to hand over messages, or pleas to actively monitor communications for child protection or antiterror purposes.
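The property Cathcart describes can be sketched, very loosely, in a few lines of Python: a toy Diffie-Hellman key exchange in which the relaying server only ever sees public values and ciphertext, so it has nothing it could hand over. This is purely illustrative — the parameters are insecure toy numbers and the cipher is a stand-in, not WhatsApp’s actual Signal-protocol implementation.

```python
# Toy sketch of the idea behind end-to-end encryption (NOT real
# cryptography, and not WhatsApp's actual protocol): a Diffie-Hellman
# exchange lets two users derive a shared key the relaying server
# never learns, so the server can only pass ciphertext along.
import hashlib

P = 2**127 - 1   # a Mersenne prime; toy-sized, insecure parameters
G = 3

# Each party keeps a private exponent and publishes only G^x mod P.
alice_priv, bob_priv = 0xA11CE, 0xB0B
alice_pub = pow(G, alice_priv, P)
bob_pub = pow(G, bob_priv, P)

# Both sides derive the same secret; the server, which sees only the
# public values, cannot.
alice_key = hashlib.sha256(str(pow(bob_pub, alice_priv, P)).encode()).digest()
bob_key = hashlib.sha256(str(pow(alice_pub, bob_priv, P)).encode()).digest()
assert alice_key == bob_key

def xor_stream(key: bytes, data: bytes) -> bytes:
    """XOR data against a SHA-256-derived keystream (toy cipher only)."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

ciphertext = xor_stream(alice_key, b"hello")  # all the server ever relays
plaintext = xor_stream(bob_key, ciphertext)   # only Bob can recover this
```

Because the decryption key exists only on the two endpoints, a demand to “monitor communications” made to the operator in the middle cannot be satisfied without changing the protocol itself — which is the change WhatsApp says it would refuse.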
The UK government already has the power to demand the removal of encryption thanks to the 2016 Investigatory Powers Act, but WhatsApp has never received a legal demand to do so, Cathcart said. The online safety bill is a concerning expansion of that power, because of the “grey area” in the legislation.
Under the bill, the government or Ofcom could require WhatsApp to apply content moderation policies that would be impossible to comply with without removing end-to-end encryption. If the company refused to do so, it could face fines of up to 4% of its parent company Meta’s annual turnover – unless it pulled out of the UK market entirely.
Similar legislation in other jurisdictions, such as the EU’s Digital Markets Act, explicitly defends end-to-end encryption for messaging services, Cathcart said, and he called for similar language to be inserted into the UK bill before it passed. “It could make clear that privacy and security should be considered in the framework. It could explicitly say that end-to-end encryption should not be taken away. There can be more procedural safeguards so that this can’t just happen independently as a decision.”
Although WhatsApp is best known as a messaging app, the company also offers social networking-style features through its “communities” offering, which allows group chats of more than 1,000 users to be grouped together to mimic services such as Slack and Discord. Those, too, are end-to-end encrypted, but Cathcart argued that the chances of a large community causing trouble were slim. “When you get into a group of that size, the ease of one person reporting it is very high, to the extent that if there’s actually something serious going on it is very easy for one person to report it, or easy if someone is investigating it for them to get access.”
The company also officially requires UK users to be older than 16, but Cathcart declined to advise parents whose children have an account on the service to delete it, saying “it’s important that parents make thoughtful choices”.
The online safety bill is expected to return to parliament this summer. If passed, it will give Ofcom significant new powers as the internet regulator, and enable it to require effective content moderation under the penalty of large fines.