Wikipedia has revealed it will begin using an artificial intelligence (AI) tool to help Wiki editors detect “bad edits” to articles.

The tool is designed to help editors maintain the quality of Wikipedia by ‘triaging’ potentially suspect edits from the torrent of new edits it receives on a daily basis.

X-ray Specs

Wikipedia said the new tool, known as the Objective Revision Evaluation Service (ORES), will provide Wiki editors with ‘X-ray specs’ to allow them to quickly home in on articles that have received bad edits.

“This service empowers Wikipedia editors by helping them discover damaging edits and can be used to immediately ‘score’ the quality of any Wikipedia article,” said Wikipedia in a blog post. “We’ve made this artificial intelligence available as an open web service that anyone can use.”

Wikipedia revealed it receives approximately half a million edits per day.

“In order to maintain the quality of Wikipedia, this firehose of new content needs to be constantly reviewed by Wikipedians,” it said. “The Objective Revision Evaluation Service (ORES) functions like a pair of X-ray specs, the toy hyped in novelty shops and the back of comic books – but these specs actually work to highlight potentially damaging edits for editors. This allows editors to triage them from the torrent of new edits and review them with increased scrutiny.”

“We’ve been testing the service for a few months and more than a dozen editing tools and services are already using it,” it added. “We’re beating the state of the art in the accuracy of our predictions.  The service is online right now and it is ready for your experimentation.”

The ORES tool has been trained on edits labelled by Wikipedia editors, so that it can recognise the quality of an edit from the language and context of the change. It then scores each edit on how likely its content is to be “damaging”.
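As a rough illustration of how such a scoring service can be queried, the sketch below asks the public ORES web API for a “damaging” prediction on a single revision. The endpoint URL, the “enwiki” context, the model name and the response layout are assumptions drawn from ORES’s published documentation rather than anything stated in this article, so treat it as a sketch to check against the current API docs.

```python
# Minimal sketch: query the ORES web service for a "damaging" score.
# The endpoint, the "enwiki" context, the "damaging" model name and the
# response layout are assumptions based on ORES's public documentation,
# not details given in this article.
import requests

ORES_URL = "https://ores.wikimedia.org/v3/scores/{context}/"


def damaging_score(rev_id, context="enwiki"):
    """Return the estimated probability that a single revision is damaging."""
    resp = requests.get(
        ORES_URL.format(context=context),
        params={"models": "damaging", "revids": rev_id},
        timeout=10,
    )
    resp.raise_for_status()
    data = resp.json()

    # Assumed response shape:
    # {context: {"scores": {rev_id: {"damaging": {"score": {"prediction": ..., "probability": {...}}}}}}}
    score = data[context]["scores"][str(rev_id)]["damaging"]["score"]
    return score["probability"].get("true", 0.0)


if __name__ == "__main__":
    # Hypothetical revision ID, purely for illustration.
    rev = 683742661
    print(f"P(damaging) for revision {rev}: {damaging_score(rev):.3f}")
```

Editing tools built on top of the service can then sort or flag incoming edits by this probability, which is the “triage” behaviour the Wikimedia blog post describes.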

Past Controversies

It remains to be seen how the ORES tool will be received by Wikipedia’s army of editors, especially given the concerns of some high-ranking scientists about the dangers of artificial intelligence.

But others have pointed to the advantages to be gained from the new technology.

In August scientists from Arizona State University said they believe they can use artificial intelligence to learn more about the military strategy of Islamic State. The researchers used AI algorithms to study patterns and behaviour of IS extremists, and said they could be of “significant” help for both the US military and policymakers in the future.

Whatever happens, it is not as if the community-written encyclopaedia has not faced controversy in the past. In October 2013, for example, Wikipedia suspended more than 250 editing accounts over concerns surrounding sponsored editing.

A similar storm broke in 2012, when editors were caught promoting certain pages for cash. Trustee Roger Bamkin was found to be placing content on the main Wikipedia page for money, whilst “Wikipedian In Residence” Max Klein set up a paid editing service for the site called UntrikiWiki.

In June, analyst house Gartner warned that artificial intelligence could take over much of the role of CIOs.


Tom Jowitt

Tom Jowitt is a leading British tech freelancer and long-standing contributor to Silicon UK. He is also a bit of a Lord of the Rings nut...
