
Porn Filters Are Not The Protection Children Need

Web filters designed to block Internet porn have become a political pawn. They are heavily promoted by politicians and supported by service providers – but with very different goals.

In 2012, some MPs called for porn blocking to be turned on by default, so any user wanting “adult content” would have to opt in to see it. The government stopped short of this, but during consultations, seemed to keep changing its mind, apparently feeling the need to be seen to be “doing something”.

Leaked porn hypocrisy

This week, the BBC published a leaked memo in which the Department for Education pretty explicitly asked ISPs to lie about their porn-blocking activity, so that the Prime Minister could look good.

With some MPs demanding default-on porn blocking, ISPs want to offer “active choice”, in which users choose whether to have porn blocking activated. After lengthy debate, the DfE memo added: “Without changing what you will be offering (ie active-choice +), the prime minister would like to be able to refer to your solutions are [sic] ‘default-on’ as people will have to make a choice not to have the filters (by unticking the box).”

The row has grown since then. MP Claire Perry seems to believe Internet porn and other content can be turned off with a simple switch, and that if we don’t do this, children will stumble across it by accident. ISPs are going along with offering the filter (albeit ensuring it’s not on by default), but are making no big claims about it – effectively hoping that the whole thing will blow over.

But the whole debate misses the point. Porn filters simply do not work, and they address the wrong problem: they are not designed to protect children from “stumbling on” material, and they are pretty ineffectual even at the job they do set out to do.

Chris Puttick, CEO of Internet protection firm TwoTen, puts it like this: “The original filtering techniques were developed in the university sector, to stop students and adults accessing material that the provider of the connection didn’t want them to – not to protect them against an accidental encounter.”

“Adult material” is something children will actually look for when they reach a certain age (certainly by the time they are teens), and so-called porn blocks can do little to stop this. “This is stuff that children from a certain age actively seek,” he told us. “They are not wandering around and finding it accidentally – that’s not how it happens.”

Content on the Internet is not pumped like water, flowing out whenever the taps are open, he said: “The Internet is something where you request material.” In so far as filtering does anything, he says, it is to limit the activity of teens.

But even if web blocking were an actual possibility, it still wouldn’t do the job Perry and her colleagues want it to – protecting children. There are a couple of reasons for this, says Puttick. First, adult content is not “monocultural”: some parents might allow their children to watch violence at an earlier age than others, for instance.

More importantly, it is very hard to legislate for things which are not “age appropriate”: “There are discussions on Facebook and videos on Youtube which are not obscene, but parents might judge them as not ‘age appropriate’ for a five year old.”

Active filtering

Not surprisingly, Puttick has an answer. His firm, TwoTen, gives children a service that works with any browser and displays only the sites and pages which have been passed for the profile their parents have selected. All the pages they see have been classified by human beings, working for TwoTen, using roughly the same categories as film classification – “U”, “PG” and so on.
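In outline, that check is simple enough to sketch. The Python below is a minimal illustration of a human-built allowlist consulted against a parental profile; the URLs, ratings table and function names are invented for the example, not drawn from TwoTen’s actual product:

```python
# Minimal sketch of a human-classified allowlist checked against a
# parental profile. All names and data here are hypothetical.

# Film-style ratings, ordered from most to least restrictive.
RATING_ORDER = ["U", "PG", "12A"]

# Pages already classified by human reviewers (URL -> rating).
classified_pages = {
    "https://example.com/cartoons": "U",
    "https://example.com/history-article": "12A",
}

def is_allowed(url: str, profile_rating: str) -> bool:
    """Return True only if the page has been classified AND its rating
    falls within the profile the parents selected. Unclassified pages
    are blocked by default."""
    rating = classified_pages.get(url)
    if rating is None:
        return False  # never seen by a classifier: blocked
    return RATING_ORDER.index(rating) <= RATING_ORDER.index(profile_rating)

print(is_allowed("https://example.com/cartoons", "U"))          # True
print(is_allowed("https://example.com/history-article", "U"))   # False
print(is_allowed("https://unknown.example.org/", "12A"))        # False
```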

At first glance this is absurd. Given the billions of pages on the Internet, how can one company expect to classify them all? It can’t hire enough people, and automatic classification won’t work.

He gives the example of the film The Boy In The Striped Pyjamas. “There’s no nudity, no violence on screen, no sex, and no bad language,” he said – and yet it has a 12A certificate, because of “concepts” like concentration camps and the Holocaust.

“Filters and automatic systems don’t work on this sort of thing at all,” he told us. “You can work out a series of search algorithms that spot porn, but you can’t make a machine that says a film should be classified as a U.”

So what happens? TwoTen uses classifiers who are guided by what children actually ask to see. And the important thing is, it blocks everything that hasn’t been classified. Everything is blocked by default, until children have asked to see it by clicking a link. If no-one has asked for it yet, the child can’t see it. If enough children ask for it, TwoTen checks it out.

“We use a voting system. If ten children ask for something in one hour, within another hour it is classified, so it is available if the policies set by parents allow it.” And it is really granular, giving ratings to individual pieces of content, not to whole sites (obviously you can’t treat Youtube as a single entity, for instance).
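As a rough illustration of the voting idea, the sketch below counts requests for an unclassified page and queues the page for a human classifier once ten arrive within an hour. The threshold and window come from Puttick’s description; everything else – names, data structures, the queue itself – is assumed:

```python
# Sketch of the voting system described above: unclassified pages are
# blocked, requests are counted, and once enough children ask within a
# time window the page is queued for human review. Hypothetical code.
import time
from collections import defaultdict, deque

VOTE_THRESHOLD = 10          # "ten children ask for something..."
WINDOW_SECONDS = 60 * 60     # "...in one hour"

recent_requests: dict[str, deque] = defaultdict(deque)
classification_queue: list[str] = []

def record_request(url: str, now: float | None = None) -> None:
    """Log a child's request for an unclassified page and queue the
    page for human classification once the threshold is reached."""
    now = time.time() if now is None else now
    votes = recent_requests[url]
    votes.append(now)
    # Drop votes that fall outside the one-hour window.
    while votes and now - votes[0] > WINDOW_SECONDS:
        votes.popleft()
    if len(votes) >= VOTE_THRESHOLD and url not in classification_queue:
        classification_queue.append(url)

# Ten requests within an hour push the page to the classifiers.
for i in range(10):
    record_request("https://example.org/new-game", now=1000.0 + i)
print(classification_queue)  # ['https://example.org/new-game']
```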

When a child asks for a page that is not yet classified, they are presented with a “Sorry But…” page in which the company’s mascot, a duck called Peepus, tells them the page isn’t available right now and suggests other things they might want to look at.
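A minimal, self-contained sketch of that fallback might look like this – the HTML, the suggestion list and the helper name are illustrative guesses, not TwoTen’s real implementation:

```python
# Sketch of the fallback described above: a request for a page that is
# blocked (unclassified, or rated above the profile) returns a
# "Sorry But..." response with alternatives instead of the content.

SUGGESTIONS = ["https://example.com/cartoons", "https://example.com/games"]

def handle_request(url: str, allowed: bool) -> str:
    """Serve the page if it passed classification for this profile,
    otherwise serve the mascot's 'Sorry But...' page."""
    if allowed:
        return f"<serving {url}>"
    links = "".join(f"<li><a href='{s}'>{s}</a></li>" for s in SUGGESTIONS)
    return (
        "<h1>Sorry But...</h1>"
        "<p>Peepus says this page isn't available right now. "
        f"Why not try one of these?</p><ul>{links}</ul>"
    )

print(handle_request("https://unknown.example.org/", allowed=False))
```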

That approach would not work with an adult, or even a teenager engaged in legitimate tasks like looking for material for a school assignment. But for young children who go to the Internet to play, it fits with the rest of their experience of the world, Puttick told us. “When they see a weird green animated duck, it doesn’t bother them – they see it as normal. We couldn’t see any other way of doing it,” he said.

TwoTen charges a monthly subscription of £5 per household. At the moment the service uses client software on devices such as PCs, tablets, notebooks and Internet TVs. It has a “school-level” offering called TwoTenNext, and plans to offer actual hardware: the TwoTen Cube, based on a Raspberry Pi, which provides a child-friendly Wi-Fi service attached to any commercial broadband router, and a TwoTen SIM, a replacement for a service provider’s mobile phone SIM card.

It has a number of employees classifying pages, paid for from TwoTen’s initial funding – but Puttick won’t give details of the funding or the number of staff, saying simply that the service is available to try out, so people can get an idea of whether it works.

At the moment, it does rely on children being young and not determined to get over the walls, he concedes: “It would suit a child who doesn’t know how to disable the service. Once they are old enough, there’s not a lot we can do.” At that age, the child can simply switch the service off.

The name TwoTen refers to this limit – the service is designed for children aged between two and ten.

Looking for the site, we stumbled across an awkward fact. Like many other innocuous phrases, “two-ten” has another meaning, provided by (and perhaps created by) the Urban Dictionary. There’s also one for his e-duck, Peepus. Puttick is well aware of that – and the fact that we found the link simply backs up his argument about the futility of filters.

“Both Urban Dictionary results still turn up when using Google Safe Search – but I can’t imagine many parents considering that as suitable for their kids,” he said. “It raises the question of how safe is safe search? I’m talking to Google about it, but when I first submitted them for exclusion from safe search results, the (semi-automated) response was to deny the request. But don’t parents assume safe search means kid friendly?”

In TwoTen’s world, if children did come across links to the Urban Dictionary, they’d get sent away. If enough of them asked to get in, someone on TwoTen’s staff would have to check out each page they asked for – which they might find tiresome or amusing. But it would answer the need of many parents to actually know what material their children can see.


Peter Judge

Peter Judge has been involved with tech B2B publishing in the UK for many years, working at Ziff-Davis, ZDNet, IDG and Reed. His main interests are networking security, mobility and cloud.
