Tuesday, December 23, 2014

Facebook: “Report false posts.” The anti-hoax button that divides the web – La Repubblica

WHO wouldn’t rather be spared the hoaxes of every kind circulating on Facebook, so numerous that the 2013 World Economic Forum report concluded that “massive digital misinformation” is “one of the main risks to modern society”? Mark Zuckerberg and his associates must have thought the same. And so, a few days ago, among the options for explaining why you are reporting an unwanted post, they quietly added one that lets you flag “false news.” The expression is at once precise and vague, and the examples the social network provides are not decisive: “a deliberately false news story,” it says, or “a scam debunked by a reliable source.” Not exactly the same thing, and in any case it remains to be understood what “debunked” means, and what counts as “a reliable source”; above all, who does the evaluating, and by what criteria.

Speaking to La Repubblica, a spokesperson for Facebook Italy explains that flagging “false news” is an option currently being tested; in other words, one of the countless experiments the platform runs on its users, which the general public began to learn about with the much-discussed study, disclosed last June, on the emotions of 700,000 subscribers. “Reporting stories you don’t want to see,” writes Facebook Italy, “helps the News Feed do a better job of showing the most relevant content in the future.”

The rationale, in short, is the same as always: maximize exposure to what we like and minimize exposure to what we don’t. Only in this way will users increase their time on the site and their interactions, in the hope (this is the goal) that sooner or later they will click on advertising content, or at least produce information useful for making new ads more targeted and effective. While we wait to learn from Menlo Park’s data scientists how members reacted, Facebook is keen to stress that it is not passing any judgment on the reported content.

But that seems a contradiction in terms when what is at stake is true versus false. Above all, the position justifies a concern, the same one raised every time content reporting on Facebook comes up: what if organized groups of users, perhaps for purposes of political propaganda, decided to mass-report content that is true but inconvenient, to pass it off as false and thus make it less visible, or even make it disappear? Facebook’s position is that reports “are handled by a dedicated team made up of qualified experts in the field of security.” But does that amount to hiring a handful of fact-checkers, journalists who would carefully probe the veracity of the reported content? Because this, and it is the potentially positive implication, is what it is really all about: the gradual introduction and experimental testing of a kind of bottom-up verification of the quality of the news circulating on the social network, carried out in a distributed, collaborative way.

Unlike other attempts, the critical mass is there, suggests Sergio Maistrello, author of “Fact-checking. Dal giornalismo alla rete” (Apogeo). It could lead to interesting developments, such as the ability to recognize and flag the best-known hoaxes before publication. “But how would deliberately false yet satirical content be treated?” asks Maistrello. Moreover, verifying the news “is most of the time a matter of shades of gray.” Walter Quattrociocchi, of the IMT Institute for Advanced Studies in Lucca, agrees, on the basis of the studies he has led precisely on the spread and consumption of misinformation on Facebook. The issue, the scholar points out, is “not so much what is true and what is not, as the clash between different partisan worldviews.” And in a context in which conspiracy theorists and their debunkers do not talk to each other (this is one of his experimental findings), the new Facebook feature “will lead to greater bitterness between factions, doing nothing but reinforce the beliefs of those accused of believing in false information, and therefore its dissemination and consumption.” Yet another example of how a subtle change to the platform can lead to social consequences that are anything but secondary.
