Friday, December 4, 2015

Wikipedia turns to artificial intelligence: to preserve quality and involve more users – La Repubblica

Wikipedia is looking to its future, and it is relying on artificial intelligence to try to secure it. The users who contribute to the compilation of the largest online encyclopedia, the paradigmatic model of collaborative creation, are in fact declining. At least in the English edition, which in eight years has lost 40% of its active authors, stabilizing at around 30,000 contributors: an elite that signs off on the planet's shared knowledge. Becoming a new editor is often complicated by the fact that changes are frequently rejected: the semi-automatic tools that oversee the process of acceptance and publication are extremely rigid and discard newly uploaded contributions even when they contain only minimal, negligible errors. In short, it may be user-friendly for its millions of readers (it sits firmly in the top ten most-visited sites in the world and has effectively wiped out the market for traditional encyclopedias, though even its traffic has shown signs of decline), but from the vantage point of those who would like to help, correct, add to, or refine its entries, Wikipedia is quite a puzzle.

That’s why the Wikimedia Foundation, the San Francisco-based nonprofit that supports Wikipedia’s activities, is trying everything. One path in particular seems worth betting on: artificial intelligence. The solution takes the form of software – dubbed the Objective Revision Evaluation Service, better known as ORES – used to distinguish good-faith errors, made perhaps by aspiring new collaborators, from content vandalism. Edits from the former are taken into account, perhaps with an invitation to revise them in a less drastic way than happens today, while those submitted by the latter are mercilessly cleaned up.
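ORES is exposed as a public scoring service that other tools can query. The Python sketch below shows, roughly, how a client might ask it whether a given revision looks damaging and whether it appears to have been made in good faith. The endpoint path, the model names ("damaging", "goodfaith"), the sample revision ID and the response layout are assumptions for illustration and may not match the live service exactly.

    # Minimal sketch: query a public ORES-style scoring endpoint for one revision.
    # Endpoint path, model names and response shape are assumed, not confirmed.
    import requests

    ORES_URL = "https://ores.wikimedia.org/v3/scores/{wiki}/"

    def score_revision(wiki, rev_id):
        """Return scores for one revision of the given wiki (e.g. 'enwiki')."""
        resp = requests.get(
            ORES_URL.format(wiki=wiki),
            params={"models": "damaging|goodfaith", "revids": rev_id},
            timeout=10,
        )
        resp.raise_for_status()
        data = resp.json()
        # Assumed response shape: {wiki: {"scores": {rev_id: {model: {"score": ...}}}}}
        return data[wiki]["scores"][str(rev_id)]

    if __name__ == "__main__":
        scores = score_revision("enwiki", 123456)  # hypothetical revision id
        p_damaging = scores["damaging"]["score"]["probability"]["true"]
        p_goodfaith = scores["goodfaith"]["score"]["probability"]["true"]
        print(f"P(damaging) = {p_damaging:.2f}, P(good faith) = {p_goodfaith:.2f}")

A patrolling tool could then treat the two probabilities differently: a likely damaging but likely good-faith edit gets a friendly invitation to revise, while a likely damaging, likely bad-faith edit gets reverted.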

The project was developed by Aaron Halfaker, a researcher working for the foundation in California, and its aim is precisely not to scare off those who want to move toward more active (and, of course, voluntary) work on the platform. The system uses an open-source collection of machine learning algorithms known as scikit-learn to detect damaging changes and separate them from edits produced by well-meaning contributors, even when those edits contain inaccuracies. It thereby pursues a dual result: cleaning up the planet’s most consulted encyclopedia from its many imperfections and from the risk of hoaxes and outright falsehoods, and, on the other hand, trying to reverse the decline in editors, ushering in a new era of cooperation.
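The article names scikit-learn but gives no detail about the model itself. The following is only an illustrative sketch of the general approach – a classifier trained on per-edit features – with invented feature names and toy data that are not ORES’s real feature set or training set.

    # Illustrative sketch: a scikit-learn classifier separating "damaging" edits
    # from good-faith ones. Features and data below are invented for the example.
    import numpy as np
    from sklearn.ensemble import GradientBoostingClassifier

    # Hypothetical per-edit features: [chars added, chars removed,
    #  profane words added, editor is anonymous (0/1), account age in days]
    X = np.array([
        [350,  10, 0, 0, 900],   # experienced editor expanding an article
        [  5, 800, 3, 1,   0],   # anonymous editor blanking text, adding slurs
        [ 40,   2, 0, 1,   0],   # anonymous newcomer fixing a typo
        [  0, 400, 0, 1,   0],   # anonymous large removal
    ], dtype=float)
    y = np.array([0, 1, 0, 1])   # 1 = damaging, 0 = good faith

    model = GradientBoostingClassifier().fit(X, y)

    # Score a hypothetical incoming edit; low-confidence cases could be routed
    # to a human reviewer rather than reverted automatically.
    new_edit = np.array([[12, 250, 2, 1, 0]])
    print(f"P(damaging) = {model.predict_proba(new_edit)[0, 1]:.2f}")

The point of the design is the separation of concerns: the classifier only scores edits, while the decision about how gently or harshly to respond stays with the community tools that consume those scores.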

ORES, like all machine learning systems – on a par with those used and developed by other giants such as Google or Facebook – can learn and refine itself over time, classifying the quality of edits and then establishing which were produced in good faith and which were not. Even on Wikipedia, in fact, the fakery factory never closes: there are companies upon companies that hire mercenary editors ready to descend on targeted entries to force the content and bend it to their clients’ wishes. Paradoxically, though, the many rules built into the review mechanism seem to weigh it down, driving away even fresh energy. “The aggressive attitude Wikipedia users take in policing edits comes from the fact that they are not encouraged to have a human interaction with the people behind them,” Halfaker told MIT Technology Review. The system will also deal with this, making those relations smoother and less traumatic. ORES will initially work on the English, Portuguese, Turkish and Farsi editions of Wikipedia. In short, to appreciate the results in Italian we will have to wait a while.

Topics:
Wikipedia
Wikipedians
AI
intelligence
hoaxes
Web hoaxes
San Francisco
social
Web encyclopedia
Stakeholders:
Aaron Halfaker