The UK Government is facing a growing revolt from big-name messaging platforms against its revised Online Safety Bill.
Will Cathcart, Meta’s head of WhatsApp, described the Online Safety Bill during a visit to the UK as the most concerning piece of legislation currently being discussed in the western world, and said that WhatsApp would refuse to remove end-to-end encryption if it became law.
The WhatsApp intervention on the Online Safety Bill is the second time a messaging platform has signalled its opposition. Rival messaging app Signal recently said it could stop providing services in the UK if the bill required it to scan messages.
Meta’s WhatsApp, however, is much bigger than Signal.
Indeed, according to Ofcom, WhatsApp is the most popular messaging platform in the UK, used by more than seven in 10 online adults.
Now the chat app’s boss has said that WhatsApp would refuse to comply with requirements in the Online Safety Bill that attempted to outlaw end-to-end encryption.
Will Cathcart made the comments during a UK visit in which he was due to meet legislators to discuss the government’s flagship internet regulation, the Guardian newspaper reported.
“It’s a remarkable thing to think about. There isn’t a way to change it in just one part of the world,” Cathcart was quoted as saying. “Some countries have chosen to block it: that’s the reality of shipping a secure product. We’ve recently been blocked in Iran, for example. But we’ve never seen a liberal democracy do that.”
“The reality is, our users all around the world want security,” Cathcart added. “Ninety-eight per cent of our users are outside the UK. They do not want us to lower the security of the product, and just as a straightforward matter, it would be an odd choice for us to choose to lower the security of the product in a way that would affect those 98 percent of users.”
WhatsApp switched on its end-to-end encryption back in 2016 and Cathcart made clear it would not be removed to suit the UK legislation.
“End-to-end” encryption is used in messaging services to prevent anyone but the recipients of a communication from being able to decrypt it, the Guardian noted. WhatsApp cannot read messages sent over its own service, and so cannot comply with law enforcement requests to hand over messages, or pleas to actively monitor communications for child protection or antiterror purposes.
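The principle can be sketched with a toy example. This is illustrative only — WhatsApp actually uses the Signal protocol, not a one-time pad — but it shows why a relaying server that handles only ciphertext, with the key held solely by the endpoints, cannot read or moderate message content:

```python
import secrets

# Toy one-time-pad sketch (NOT real-world cryptography): the key is shared
# only between the two endpoints; the server relays ciphertext it cannot read.
def encrypt(key: bytes, message: bytes) -> bytes:
    return bytes(k ^ m for k, m in zip(key, message))

decrypt = encrypt  # XOR is its own inverse

key = secrets.token_bytes(32)           # known only to sender and recipient
plaintext = b"meet at noon"
ciphertext = encrypt(key, plaintext)    # this is all the server ever sees

assert ciphertext != plaintext                 # unreadable in transit
assert decrypt(key, ciphertext) == plaintext   # recipient recovers it
```

Any content-scanning requirement would therefore oblige the provider either to hold the keys itself or to inspect messages on the device before encryption — both of which break the end-to-end guarantee.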
The UK government already has the power to demand the removal of encryption thanks to the 2016 Investigatory Powers Act, but WhatsApp has never received a legal demand to do so, Cathcart reportedly said. He described the Online Safety Bill as a concerning expansion of that power, because of the “grey area” in the legislation.
The Guardian noted that under the bill, the government or Ofcom could require WhatsApp to apply content moderation policies that would be impossible to comply with without removing end-to-end encryption.
If the company refused to do so, it could face fines of up to 4 percent of its parent company Meta’s annual turnover – unless it pulled out of the UK market entirely.
Similar legislation in other jurisdictions, such as the EU’s Digital Markets Act, explicitly defends end-to-end encryption for messaging services, Cathcart reportedly said, and he called for similar language to be inserted into the UK bill before it passed.
“It could make clear that privacy and security should be considered in the framework. It could explicitly say that end-to-end encryption should not be taken away,” said Cathcart. “There can be more procedural safeguards so that this can’t just happen independently as a decision.”
The Guardian noted that although WhatsApp is best known as a messaging app, the company also offers social networking-style features through its “communities” offering, which allows group chats of more than 1,000 users to be grouped together to mimic services such as Slack and Discord. Those, too, are end-to-end encrypted, but Cathcart argued that the chances of a large community causing trouble were slim.
“When you get into a group of that size, the ease of one person reporting it is very high, to the extent that if there’s actually something serious going on it is very easy for one person to report it, or easy if someone is investigating it for them to get access.”
The company also officially requires UK users to be older than 16, but Cathcart declined to advise parents whose children have an account on the service to delete it, saying “it’s important that parents make thoughtful choices”.
The UK government is expected to return the Online Safety Bill to parliament in the summer.