In 1906 the scientist Francis Galton made a singular discovery: by combining the responses of a group of people who had answered independently, the collective conclusion was surprisingly accurate. The experiment was one of the inspirations for James Surowiecki's famous book ‘The Wisdom of Crowds’ (published in Spanish as ‘Cien mejor que uno’), and it lent weight to the theory that the collective opinion on certain questions often turns out to be correct.
Now that wisdom, or collective intelligence, is set to become part of Facebook. The company, widely criticized over the controversial fake news that circulated during the last elections in the United States, has already launched a plan to combat the problem. One key piece of that plan is using you, yes, you, as part of the intelligent mass that helps detect fake news. And that, recent history suggests, may not be such a good idea.
The wisdom of crowds works… sometimes
Galton's experiment consisted of estimating the weight of a slaughtered ox, so the mechanics were very straightforward: each of the 800 respondents gave their estimate, and in the end the median of all those numbers turned out to be surprisingly close to the animal's real weight. It was off by only 0.8%.
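Galton's aggregation can be sketched in a few lines of Python. This is a simulation, not his data: the individual guesses below are randomly generated under the assumption of roughly unbiased, noisy estimators, with the true weight set to the 1,198 pounds of the original experiment.

```python
import random
import statistics

TRUE_WEIGHT_LB = 1198  # the ox's actual weight in Galton's 1906 experiment

random.seed(42)
# Simulate 800 independent, noisy individual estimates (assumption:
# unbiased guesses with roughly a 10% spread).
estimates = [random.gauss(TRUE_WEIGHT_LB, 120) for _ in range(800)]

# The "crowd" answer is the median of all individual answers.
crowd_estimate = statistics.median(estimates)
relative_error = abs(crowd_estimate - TRUE_WEIGHT_LB) / TRUE_WEIGHT_LB

print(f"Crowd median: {crowd_estimate:.0f} lb")
print(f"Relative error: {relative_error:.1%}")
```

The median is what makes this robust: a few wild guesses barely move it, whereas they could drag an average far from the truth.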
It would be interesting to know how the scientist Francis Galton would respond to the problems that occur today on sites like Reddit or Digg.
The problem with this experiment is that although the mechanism works well for objective questions, it does not work as well for those where no such objective answer exists. When what has to be evaluated is the quality or truthfulness of something, things get complicated. That is something the people behind news aggregators such as Reddit or Digg have known since their earliest versions, but the same effect also produces surprising results in services where users rate products.
The perfect example is Amazon: Mikhail Gronas, a researcher at Dartmouth, revealed a strange and surprising pattern in the ratings of certain books on Amazon. There were titles with consistently high ratings (the ‘Harry Potter’ saga) and others with consistently low ones, but there were also some with a curious curve: many one-star scores and many five-star scores, producing a horseshoe-shaped distribution. These were the controversial books that generated as much criticism as praise, but which achieved something singular: very high sales.
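The horseshoe pattern is easy to picture with a toy example. The rating counts and the detection rule below are invented for illustration; the idea is simply that a polarizing book piles up ratings at both extremes while a consensus book clusters at one end.

```python
from collections import Counter

# Hypothetical 1-5 star rating samples; the counts are invented.
consensus_book = [5] * 120 + [4] * 40 + [3] * 10
polarizing_book = [1] * 70 + [2] * 10 + [3] * 5 + [4] * 10 + [5] * 80

def star_histogram(ratings):
    """Count ratings per star, from 1 to 5."""
    counts = Counter(ratings)
    return [counts.get(star, 0) for star in range(1, 6)]

def is_horseshoe(ratings):
    """Crude rule: both extremes outnumber every middle bucket."""
    h = star_histogram(ratings)
    return min(h[0], h[4]) > max(h[1], h[2], h[3])

print(star_histogram(polarizing_book))  # ones and fives dominate
print(is_horseshoe(consensus_book), is_horseshoe(polarizing_book))
```

Note that an average hides this entirely: a book rated all threes and a book rated half ones, half fives score the same mean, which is why the shape of the distribution matters.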
Votes and biases
The problem with this system is that when readers are about to buy a book and see its scores and ratings, they no longer act independently: they do so with a bias, whether positive or negative. That alone is enough to break the premises of the experiment Galton had carried out a century earlier.
Digg in its first era, when all users voted on news, Menéame today, and above all Reddit all make use of that same principle: the users themselves judge the quality of news from their own point of view. Each of these services has (or, in Digg's case, had) an algorithm to weigh upvotes and, where they exist, downvotes, which makes the community itself the moderator of that content.
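To give an idea of what such vote-weighing looks like, here is a version of the "hot" ranking formula that Reddit published in its old open-source code (it is not the exact algorithm any of these sites runs today): net votes count logarithmically, and newer posts get a time bonus.

```python
from datetime import datetime, timezone
from math import log10

# Reference epoch used by Reddit's published ranking code.
EPOCH = datetime(2005, 12, 8, 7, 46, 43, tzinfo=timezone.utc)

def hot(ups: int, downs: int, date: datetime) -> float:
    """Rank a post: log-scaled net votes plus a recency bonus."""
    score = ups - downs
    order = log10(max(abs(score), 1))
    sign = 1 if score > 0 else -1 if score < 0 else 0
    seconds = (date - EPOCH).total_seconds()
    return round(sign * order + seconds / 45000, 7)

now = datetime.now(timezone.utc)
# At equal age, more net votes wins, but with diminishing returns:
# the logarithm means the 100th vote matters far less than the 10th.
print(hot(100, 10, now) > hot(10, 5, now))
```

The logarithm is the interesting design choice: it damps the advantage of runaway stories, while the time term steadily pushes old content down the page regardless of its votes.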
That raises several problems. The first is the groups of users who try to game the system and bend it to their own benefit, whatever that may be. Many of these services have frequently had trouble with multi-account users (a problem that the comment systems of this and other media also suffer) who try to act as pressure groups to favor or harm certain content. This is where the so-called ‘mafias’ and the ‘clustering’ of users come from: groups that form almost automatically either for or against something, along with all the subgroups that coalesce around specific aspects of the same question.
And that is not the only problem: by the time we see the votes, our perception of the content is already conditioned by them. We do not judge a story the same way if a lot of people have rated it positively or negatively. Added to this is the fact that the people who devote the most time to these systems are the ones who end up shaping the kind of content that appears on those sites. These social news sites must also strive to avoid becoming another example of the so-called ‘Peter principle’, which states the following:
People who perform their work well are promoted to positions of greater responsibility, up to the point where they reach a position whose duties they can no longer carry out, thus reaching their maximum level of incompetence.
Avoiding all these problems while harnessing the wisdom of crowds in a consistent way is very complex, and in fact not even Reddit, the anarchic website par excellence, has been safe from them. A few days ago it was discovered that its co-founder, Steve Huffman, had used his administrator “superpowers” to edit comments from Trump supporters, something that has left both Huffman and Reddit's reputation for independence in a very bad place.
Facebook and its community fake-news detector
Facebook therefore faces a daunting task. Users will be able to mark any story that appears in their feeds as false, and from there a number of reviewers, full-time Facebook employees, will assess whether, after all those reports, a story really can be described as false or suspected of being so.
From there, other mechanisms kick in: a number of outside organizations will collaborate in the task of validating and fact-checking stories, and if false news is detected it is marked as disputed. Such stories will not disappear from the system, but if users want to share them and “propagate the lie”, they will have to do so after being clearly and explicitly warned about what they are doing.
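The report-review-warn flow described above can be sketched as a small state machine. Everything here is invented for illustration (the class names, the threshold of ten reports, the warning text); it is not Facebook's actual system, only the general shape of one.

```python
from dataclasses import dataclass

FLAG_THRESHOLD = 10  # invented: user reports needed before human review

@dataclass
class Story:
    url: str
    reports: int = 0
    disputed: bool = False

def report(story: Story) -> bool:
    """A user marks the story as false; True means it now needs review."""
    story.reports += 1
    return story.reports >= FLAG_THRESHOLD and not story.disputed

def fact_check(story: Story, checker_says_false: bool) -> None:
    """External fact-checkers mark the story as disputed, never delete it."""
    if checker_says_false:
        story.disputed = True

def share(story: Story) -> str:
    """Disputed stories can still be shared, but only after a warning."""
    if story.disputed:
        return f"WARNING: {story.url} has been disputed by fact-checkers."
    return f"Shared {story.url}"

story = Story("http://example.com/hoax")
for _ in range(FLAG_THRESHOLD):
    report(story)
fact_check(story, checker_says_false=True)
print(share(story))
```

The key property of this design is that the crowd only triggers the review; the verdict comes from the fact-checkers, which is precisely how the plan tries to sidestep the vote-gaming problems described earlier.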
That may work, but Facebook's scale is so daunting that the task of vetting fake news could prove completely unsustainable. Aaron Sharockman, executive director of PolitiFact (one of the organizations collaborating in this effort), noted that “there have always been more things to fact-check than people to check them. I don't expect that to change”. Even so, he is optimistic about this kind of feature.
But the problem neither Sharockman nor Facebook mentions is precisely that giving that power to users, interesting as it is, could end up creating for Facebook the same problems that communities like Reddit, Digg or Menéame have had in the past. Dealing with this is especially complex, so it will be interesting to see how events unfold. The true ones and the false ones.