Facebook’s efforts to police its platform and prevent abuse by foreign actors and fake accounts continue.
In its latest update on enforcing its community standards, Facebook revealed that over the past six months it has disabled 3.4 billion fake accounts.
Earlier this month Facebook gave an update on its crackdown on Russian accounts peddling fake news, after it deleted a further 97 pages, groups and accounts.
Facebook at the time blamed two Russian operations for creating multiple accounts that misled others about who they were and what they were doing.
That action came after Facebook revealed last August it had taken down over 650 Facebook pages and groups that it said were linked to Russia.
And its action against fake accounts continues, with Facebook providing a fresh update on its recent policing efforts via its third Community Standards Enforcement Report.
“For fake accounts, the amount of accounts we took action on increased due to automated attacks by bad actors who attempt to create large volumes of accounts at one time,” Facebook said. “We disabled 1.2 billion accounts in Q4 2018 and 2.19 billion in Q1 2019.”
And for the first time Facebook’s report provided data on how much content people appealed and how much content was restored after the platform had initially taken action.
Facebook also opened up about its crackdown on illicit sales of regulated goods, namely firearm and drug sales.
Facebook is taking action across nine categories including adult nudity and sexual activity; bullying and harassment; child nudity and sexual exploitation of children; fake accounts; hate speech; regulated goods; spam; global terrorist propaganda; and lastly violence and graphic content.
For example, Facebook estimates that for every 10,000 times people viewed content on Facebook, 11 to 14 views contained content that violated its adult nudity and sexual activity policy.
For every 10,000 times people viewed content on Facebook, 25 views contained content that violated its violence and graphic content policy.
And it estimates that 5 percent of monthly active accounts are fake.
There is little doubt that Facebook is ramping up internal efforts to better police its platform.
Last month Facebook banned a number of groups and individuals associated with the far right for “spreading hate”.
The ban included a number of well-known groups and individuals, such as the British National Party (BNP) and its former leader Nick Griffin, as well as the English Defence League and the National Front.