Government Introduces Amended Online Safety Bill
Updated proposals, backed by stiff penalties and sanctions, will make the UK the safest place to be online, the Government claims, but critics are not convinced
On Thursday the Government unveiled to Parliament its proposed Online Safety Bill (OSB), which has been subject to multiple last-minute amendments.
The government in its announcement said that “the Online Safety Bill marks a milestone in the fight for a new digital age which is safer for users and holds tech giants to account.”
The government claims the Bill will protect children from harmful content such as pornography and limit people’s exposure to illegal content, while protecting freedom of speech. Critics however argue the Bill may achieve the exact opposite.
OSB proposals
Since the first draft was published back in December, the OSB has faced pre-legislative scrutiny from MPs and peers.
This has resulted in a number of additions and amendments to the Bill.
And what will concern the likes of Facebook, Twitter and Google is that a new legal duty in the Bill will require them to prevent paid-for fraudulent adverts from appearing on their services.
Under a previous draft of the Online Safety Bill, those platforms would have had a “duty of care” to protect users from fraud by other users.
The government says the change will improve protections for internet users from the potentially devastating impact of fake ads, including where criminals impersonate celebrities or companies to steal people’s personal data, peddle dodgy financial investments or break into bank accounts.
Jail sentences
The OSB will also require social media platforms, search engines and other apps and websites allowing people to post their own content to protect children, tackle illegal activity and uphold their stated terms and conditions.
The regulator Ofcom will have the power to fine companies failing to comply with the laws up to ten per cent of their annual global turnover, force them to improve their practices and block non-compliant sites.
And tech executives whose companies fail to cooperate with Ofcom’s information requests could now face prosecution or jail time within two months of the Bill becoming law, rather than the two years set out in the previous draft.
A raft of other new offences has also been added to the Bill, making in-scope companies’ senior managers criminally liable for destroying evidence, failing to attend or providing false information in interviews with Ofcom, and obstructing the regulator when it enters company offices.
“The internet has transformed our lives for the better,” said digital secretary Nadine Dorries. “It’s connected us and empowered us. But on the other side, tech firms haven’t been held to account when harm, abuse and criminal behaviour have run riot on their platforms. Instead they have been left to mark their own homework.”
“We don’t give it a second’s thought when we buckle our seat belts to protect ourselves when driving,” said Dorries. “Given all the risks online, it’s only sensible we ensure similar basic protections for the digital age. If we fail to act, we risk sacrificing the wellbeing and innocence of countless generations of children to the power of unchecked algorithms.”
“Since taking on the job I have listened to people in politics, wider society and industry and strengthened the Bill, so that we can achieve our central aim: to make the UK the safest place to go online,” said Dorries.
Industry reaction
Some within the tech industry have welcomed the government’s proposed Online Safety Bill.
“We all deserve greater protection when using online services – and the need for the Online Safety Bill signals just that,” said Amir Nooriala, chief commercial officer at Callsign. “We know it will focus on putting the onus on technology companies to make sure consumers are protected online, and most importantly, being transparent with the public about the measures being put in place to keep them safe.”
“We are well aware of the horror stories around victims being tricked into sharing passwords, sensitive information, and receiving online hatred from trolls, but the Online Safety Bill is a positive step towards building the trust back up between the tech industry and consumers,” said Nooriala.
“If the government truly wants to make the UK the safest place to go online, it needs to work with tech organisations to make sure they are valuing their relationship with the public,” said Nooriala. “The tech sector is responsible for putting the correct identity authentication methods in place to positively identify genuine users, so they can get on with their digital lives in peace, while making sure fraudsters aren’t given the attention they don’t deserve.”
People at risk
But other figures, such as Robin Wilton, Director of Internet Trust at the Internet Society, a global nonprofit committed to building an open, globally connected, secure and trustworthy Internet for everyone, argue the Online Safety Bill does the opposite of what it claims to achieve by sacrificing confidentiality online, making platforms liable for criminal activity, and undermining encryption.
“The Online Safety Bill will probably fail to keep children safe, and will put them and others at risk by sacrificing confidentiality,” noted Wilton. “Its goal seems to be to increase uncertainty for service providers, rather than prosecutions for offences or prevention of bad behaviour.”
“A service provider fixated on managing its own liability doesn’t mean better child safety,” said Wilton. “Nadine Dorries is simultaneously praising secure comms and access to reliable information in Ukraine, while seeking to take it away from her own citizens.”
“‘Offensive content’ is too vague to legislate, and too contextual,” Wilton added. “Is an image that is yellow in the bottom half and blue in the top half, like a Ukrainian flag, harmless, or offensive and subversive?”
“The OSB still can’t define ‘harm’ clearly,” he added. “It also creates liability for things that haven’t even happened yet.”
Wilton also pointed out that, while the OSB does not meaningfully mention encryption, it will have the effect of pushing service providers to withdraw it and to over-censor.
“It makes platforms liable for the behaviour of their users,” said Wilton. “We wouldn’t expect supermarkets to be liable for knife crime by selling knives, so why would we expect the same for platforms? It will only encourage them to withdraw encrypted services and make their users less safe.”
“What children and other at-risk people need is a secure lifeline to someone they can trust; we shouldn’t be inserting third parties into those conversations,” said Wilton.
Legal viewpoint
And the threat of criminal offences for senior managers was noted by legal experts, including Ben Packer, partner at Linklaters.
“The Government are proposing that named senior managers at tech firms will be criminally liable if their company fails to comply with Ofcom’s supervisory or enforcement measures,” said Packer. “The primary changes to the previous draft bill are that these powers will come in after two months rather than two years and will be expanded in scope to cover other types of ‘information’ offences.”
“The devil will be in the detail but, to be workable, the final law will surely need to limit these offences to only cover information the individual could reasonably be expected to provide and to contain defences, for instance if the senior manager honestly believed that what they provided to Ofcom was not false,” said Packer.
“A major challenge facing Ofcom will be how to enforce these powers, given that many of the platforms in the scope of the regime will not have a physical presence or individual members of staff based in the UK,” he said.
Packer also noted that one of the major areas of ambiguity in the original draft of the Online Safety Bill was the definition of “legal but harmful” content.
“Today’s confirmation that the Government and Parliament – rather than each platform – will define what is meant by this phrase should remove some of this ambiguity, but will mean that the definition is more static and less able to adapt and respond to emerging areas of harm,” said Packer. “It will also mean that there is political involvement in defining the categories of content that the major platforms will have duties to assess and explain how they are being dealt with.”