Ofcom has warned that “robust” age verification checks on all websites containing online pornography must be in place by July 2025 at the latest.
Ofcom is the communications regulator in charge of enforcing the UK’s controversial Online Safety Act, and on Thursday published its guidance on “effective age checks to prevent children from encountering online pornography and other harmful content.”
Ofcom first published its proposals for acceptable age-verification methods for porn websites in December 2023, in order to comply with the Online Safety Act, which had received Royal Assent in October 2023.
The passing of the controversial Online Safety Bill had been a bitter blow to privacy campaigners and big-name tech firms, which had consistently opposed the British legislation.
Tech firms had been alarmed at the bill’s provisions that could require an encryption backdoor.
In March 2023 WhatsApp and Signal warned they would rather pull out of the UK than comply with the act’s requirements.
Companies that fail to adhere to the regulations can be fined up to 10 percent of their global turnover, and Ofcom can also seek to have offending websites blocked from being accessible in the UK.
Executives of offending websites could also be jailed.
But another controversial aspect of the Online Safety Act was the requirement for age verification to access online pornographic websites.
Ofcom has cited research showing that children are being exposed to online pornography from an early age. Among those who have seen online pornography, the average age at which they first encounter it is 13 – although more than a quarter (27 percent) come across it by age 11, and one in ten (10 percent) as young as nine.
Ofcom had proposed that acceptable age-checking or verification methods include banking verification to ensure the user is over 18; photo ID checking; facial age estimation; mobile network operator age checking; credit card checks; and digital identity wallet checks.
Now Ofcom has stated that ‘robust’ age checks are a cornerstone of the Online Safety Act.
It requires services which allow pornography or certain other types of harmful content to introduce ‘age assurance’ to ensure that children are not normally able to encounter it. Age assurance methods – which include age verification, age estimation or a combination of both – must be ‘highly effective’ at correctly determining whether a particular user is a child.
Ofcom said its approach was designed to be flexible, tech-neutral and future-proof, in order to protect children and “ensure that privacy rights are protected and that adults can still access legal pornography.”
The regulator said that “as platforms take action to introduce age assurance over the next six months, adults will start to notice changes in how they access certain online services. Our evidence suggests that the vast majority of adults (80 percent) are broadly supportive of age assurance measures to prevent children from encountering online pornography.”
Ofcom stated it will “contact a range of adult services – large and small – to advise them of their new obligations. We will not hesitate to take action and launch investigations against services that do not engage or ultimately comply.”
“For too long, many online services which allow porn and other harmful material have ignored the fact that children are accessing their services,” said Melanie Dawes, Ofcom’s chief executive. “Either they don’t ask or, when they do, the checks are minimal and easy to avoid. That means companies have effectively been treating all users as if they’re adults, leaving children potentially exposed to porn and other types of harmful content. Today, this starts to change.”
“As age checks start to roll out in the coming months, adults will start to notice a difference in how they access certain online services,” said Dawes. “Services which host their own pornography must start to introduce age checks immediately, while other user-to-user services – including social media – which allow pornography and certain other types of content harmful to children will have to follow suit by July at the latest.”
“We’ll be monitoring the response from industry closely,” said Dawes. “Those companies that fail to meet these new requirements can expect to face enforcement action from Ofcom.”
It should be remembered that it was the government of David Cameron that, as far back as 2015, had promised age verification checks in an effort to stop children from accessing online porn.
That was despite the fact that opt-out ISP porn filters had already been introduced by the government in 2013; these were intended to help households control access to adult material, but unintentionally blocked educational resources such as sexual health websites.
The government tried again to implement age checks for online porn, which were due to come into force in April 2018.
But the government again delayed the measure, despite porn website owners having prepared for the legal checks.
In 2018, for example, the owner of porn websites including PornHub revealed AgeID, an online age verification tool that it would use to verify the age of people seeking online smut.