The UK Government is facing a growing revolt from big-name messaging platforms against its revised Online Safety Bill.

Will Cathcart, Meta’s head of WhatsApp, during a visit to the UK described the Online Safety Bill as the most concerning piece of legislation currently being discussed in the western world, and said WhatsApp would refuse to remove end-to-end encryption if it became law.

The WhatsApp intervention on the Online Safety Bill is the second time a messaging platform has signalled its opposition. Rival messaging app Signal recently said it could stop providing services in the UK if the bill required it to scan messages.

UK exit?

Meta’s WhatsApp, however, is much bigger than Signal.

Indeed, according to Ofcom, WhatsApp is the most popular messaging platform in the UK, used by more than seven in 10 online adults.

Now the chat app’s boss has said that WhatsApp would refuse to comply with requirements in the Online Safety Bill that sought to outlaw end-to-end encryption.

Will Cathcart made the comments while speaking during a UK visit in which he is meeting legislators to discuss the government’s flagship internet regulation, the Guardian newspaper reported.

“It’s a remarkable thing to think about. There isn’t a way to change it in just one part of the world,” Cathcart was quoted as saying. “Some countries have chosen to block it: that’s the reality of shipping a secure product. We’ve recently been blocked in Iran, for example. But we’ve never seen a liberal democracy do that.”

“The reality is, our users all around the world want security,” Cathcart added. “Ninety-eight per cent of our users are outside the UK. They do not want us to lower the security of the product, and just as a straightforward matter, it would be an odd choice for us to choose to lower the security of the product in a way that would affect those 98 percent of users.”

End-to-end encryption

WhatsApp switched on its end-to-end encryption back in 2016, and Cathcart made clear it would not be removed to suit the UK legislation.

“End-to-end” encryption is used in messaging services to prevent anyone but the recipients of a communication from being able to decrypt it, the Guardian noted. WhatsApp cannot read messages sent over its own service, and so cannot comply with law enforcement requests to hand over messages, or pleas to actively monitor communications for child protection or antiterror purposes.

The UK government already has the power to demand the removal of encryption thanks to the Investigatory Powers Act 2016, but WhatsApp has never received a legal demand to do so, Cathcart reportedly said. He described the Online Safety Bill as a concerning expansion of that power, because of a “grey area” in the legislation.

The Guardian noted that under the bill, the government or Ofcom could require WhatsApp to apply content moderation policies that would be impossible to comply with without removing end-to-end encryption.

If the company refused to do so, it could face fines of up to 4 percent of its parent company Meta’s annual turnover, unless it pulled out of the UK market entirely.

EU protections

Similar legislation in other jurisdictions, such as the EU’s Digital Markets Act, explicitly defends end-to-end encryption for messaging services, Cathcart reportedly said, and he called for similar language to be inserted into the UK bill before it is passed.

“It could make clear that privacy and security should be considered in the framework. It could explicitly say that end-to-end encryption should not be taken away,” said Cathcart. “There can be more procedural safeguards so that this can’t just happen independently as a decision.”

The Guardian noted that although WhatsApp is best known as a messaging app, the company also offers social networking-style features through its “communities” offering, which allows group chats of more than 1,000 users to be grouped together to mimic services such as Slack and Discord. Those, too, are end-to-end encrypted, but Cathcart argued that the chances of a large community causing trouble were slim.

“When you get into a group of that size, the ease of one person reporting it is very high, to the extent that if there’s actually something serious going on it is very easy for one person to report it, or easy if someone is investigating it for them to get access.”

The company also officially requires UK users to be older than 16, but Cathcart declined to advise parents whose children have an account on the service to delete it, saying “it’s important that parents make thoughtful choices”.

The UK government is expected to return the Online Safety Bill to parliament in the summer.

Tom Jowitt

Tom Jowitt is a leading British tech freelancer and long-standing contributor to Silicon UK. He is also a bit of a Lord of the Rings nut...
