Facebook is fairly vigilant in tracking and killing illicit activity on its site. It has done a fair amount of work to protect children too, the most notable move being to include the so-called “panic button” that lets youngsters report abuse to the UK Child Exploitation and Online Protection Centre (Ceop).
I’ve just learnt of another way that Facebook attempts to counter paedophiles on the site: it monitors spikes in friend requests between those older than 16 and those younger. When Facebook thinks you’re befriending too many minors, it can block you.
This can, unsurprisingly, backfire, and that is what we understand happened to a charity worker. A TechWeekEurope source at a government-backed charity that works directly with young people revealed that one of its workers was temporarily barred from Facebook. You can guess why: they had befriended a lot of young people.
It’s hard to criticise Facebook too heavily for blocking the charity worker, but it does leave you wondering why a company that can afford to spend $1 billion on a small firm like Instagram doesn’t invest in more intelligent, innovative technologies to combat egregious acts on its site. It would save itself from both embarrassment and customer ire.
Here is what I would suggest: use more linguistic and behavioural software. Businesses can already train software, using machine-learning algorithms, to pick out the typical linguistic traits of certain kinds of users. If Facebook leveraged this to pick up on typical grooming language, it wouldn’t have to rely on numbers alone to decide when something needs investigating.
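To make the idea concrete, here is a minimal sketch of how such a linguistic classifier might be trained. The snippets, labels and scoring are purely hypothetical illustrations, not anything Facebook is known to use:

```python
# Minimal sketch: train a text classifier to flag messages whose language
# resembles previously labelled "suspicious" examples.
# All data, labels and thresholds here are hypothetical illustrations.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training data: message snippets labelled 1 (flag for review)
# or 0 (benign). A real system would need a large, carefully curated corpus.
messages = [
    "don't tell your parents we talk",
    "this is our little secret, ok?",
    "happy birthday! see you at school tomorrow",
    "can you send the homework answers please",
]
labels = [1, 1, 0, 0]

# TF-IDF turns each message into word-frequency features; logistic regression
# then learns which terms are associated with the flagged class.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(messages, labels)

# Score a new message: a high probability would add weight to other signals
# (such as friend-request patterns) rather than trigger a block on its own.
score = model.predict_proba(["keep this between us"])[0][1]
print(f"review score: {score:.2f}")
```

The point of the sketch is that the output is a score to feed into a wider investigation, not a verdict on its own.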
Then Facebook could combine this with behavioural software. The Economist recently noted how consultancy Ernst & Young was offering kit it claimed could monitor an employee’s emotional state over time. It could notify the business when a worker was being “secretive”. It’s not hard to see how this could translate into Facebook’s world.
And it doesn’t have to be intrusive. Let the software do the work; do not allow any Facebook employees to view user messages or posts. Just use the data and graphical representations to see where potential problems lie. Facebook already has enough privacy problems as it is.
In those cases where Facebook is certain something illegal is going on, it may even be advisable to bust open users’ accounts. Obviously, Zuckerberg and Co would have to tread very carefully here.
But what is evident is that users really should not be blocked on the basis of a single metric. Facebook should seek a more rounded picture of users it suspects of wrongdoing, and by combining the data it collects on friend requests with behavioural and linguistic software, it could do that. That’s not to say there would be no more cases like the one involving our charity worker source, but there would be far less needless blocking of innocent users.
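As a rough illustration of what a “more rounded picture” could mean in practice, the sketch below combines several normalised signals into one review score instead of acting on the friend-request count alone. The signal names, weights and numbers are invented for illustration:

```python
# Hypothetical sketch: combine several signals into one review score,
# rather than blocking an account on the friend-request metric alone.
# Signal names and weights are invented for illustration only.

def review_score(signals: dict[str, float], weights: dict[str, float]) -> float:
    """Weighted average of normalised signals, each in the range 0..1."""
    total_weight = sum(weights.values())
    return sum(signals[name] * weight for name, weight in weights.items()) / total_weight

weights = {
    "minor_friend_request_spike": 0.4,   # the metric Facebook already tracks
    "linguistic_flag": 0.4,              # output of a classifier like the one sketched above
    "behavioural_anomaly": 0.2,          # e.g. sudden changes in messaging patterns
}

# A charity worker might score high on friend requests but low on everything
# else, so the combined score stays well below a human-review threshold.
charity_worker = {
    "minor_friend_request_spike": 0.9,
    "linguistic_flag": 0.05,
    "behavioural_anomaly": 0.1,
}

print(f"combined score: {review_score(charity_worker, weights):.2f}")  # prints 0.40
```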
There’s an altruistic, corporate social responsibility opportunity for the company here too: Facebook could provide more valuable information to law enforcement. It is well known that the social network already works with police to track down criminals, as we saw when two men were jailed for four years for using Facebook to incite riots. In that case, the punishment struck many as very harsh.
It’s unlikely those same people would feel aggrieved if Facebook were helping to track down and jail paedophiles using the aforementioned technology. If Facebook could avoid any privacy infringements in doing so, it would be a win-win situation.