At the beginning of last week, Donald Trump, against what all the polls said, won the election for President of the United States. A few days later, after pressure from some media outlets, Mark Zuckerberg gave an interview in which he defended Facebook, arguing that the false news it hosted had had nothing to do with the election results. Why?
In October, BuzzFeed published its own research in which, according to its data, "right-wing" political pages on Facebook shared false or inaccurate information 38% of the time; "left-wing" pages did the same 20% of the time. Often chasing the spectacular, quick "click", false stories like these were shared by thousands of people and therefore seen by many more. Even the Washington Post put together a section dedicated to debunking fakes.
But the Internet is full of false stories (and if you don't believe it, check out Engadget's fake hunters)... So what makes these different? In the American media coverage of this story, one statistic has been repeated ad nauseam: 44% of Americans read news on Facebook.
Joshua Benton, of the Nieman Lab (Harvard University), has been one of the journalists most critical of Facebook since the election, and he explained it like this:
[…] "Facebook has become a cesspool of disinformation. Part of it is motivated by ideology, but much is motivated by the economic incentive Facebook has created: false things, when they connect with a user's preconceived notions or sense of identity, spread like wildfire (and are much cheaper to produce than real news)." (Joshua Benton, Nieman Lab)
For Benton, Facebook has built a platform that favours the spread of lies, and "there have been a lot of people who voted in this election because they were angry about made-up things they had read online". Zeynep Tufekci, of the University of North Carolina, studies the impact of technology on society, and gave the New York Times the following example:
"A false story claiming that Pope Francis, who in fact is an advocate for refugees, had endorsed Mr. Trump was shared almost a million times, probably being seen by tens of millions. Its correction had barely any impact. Of course Facebook has had a significant influence on the results of this election." (Zeynep Tufekci, University of North Carolina). Naturally, the Pope did not endorse Donald Trump; in fact, the website that published the story admits that most of its articles "are satire or pure fantasy".
In a more discreet way, other prestigious media outlets also criticized the social network. For example, in this story the Washington Post put it like this:
"While Facebook has become the favorite online home of many Americans, it has also become host to a variety of partisan content generators that publish mountains of inaccurate or outright invented stories, explicitly designed to be massively shared among the people most likely to believe them." (The Washington Post)
Zuckerberg then responded to accusations like the ones mentioned at the beginning of this article, explaining that "voters make decisions based on their lived experience" and that he believed "there is a certain profound lack of empathy in asserting that the only reason someone could have voted the way they did is because they saw some fake news".
Why does Zuckerberg insist so much?
The controversy doesn't end there. Perhaps because the subject is "journalistic", or because some are still looking for possible explanations for Trump's unexpected victory, numerous related reports have appeared in the media since then. Mark Zuckerberg spoke out again, insisting that "99% of the content that people see is real" and that only a small part is false (a part that, furthermore, according to him, is not always related to politics).
Why so much emphasis from Zuckerberg on making clear that Facebook, according to him, had not influenced the election? Apparently, as the New York Times reported after speaking with sources close to the company, several executives of the social network questioned from the start the role the platform had played and the responsibilities it should assume.
Moreover, Gizmodo claimed soon after, having spoken with (anonymous) Facebook employees, that the social network had been debating the issue for some time and had even considered an update of the algorithm to detect false news and reduce its visibility. That update, according to those same sources, was shelved because it would have disproportionately affected the visibility of content from "right-wing" sites, and out of fear of the reaction of conservatives. Facebook denied it.
In fact, according to BuzzFeed, there is a group of "renegade" Facebook employees who have formed a "task force", without authorization from their CEO, to fight false news ever since he denied its influence. "Right now they meet in secret so its members can speak freely," with the idea of "drawing up a list of recommendations for Facebook's senior management," says BuzzFeed, which also reports some discontent among "hundreds" of employees over the social network's treatment of false news and Zuckerberg's subsequent explanation.
It is not the first time the social network has starred in a political controversy. In May of this year, Gizmodo published an article about how the staff who chose the "trending topics" were biased, relying on the judgment of the employee who selected them, with supposedly "conservative" topics less likely to appear in that section. The matter reached the Republican Party, which released a highly critical letter on the issue, and even the Senate, which called for an investigation. Facebook always denied it, but shortly afterwards dismissed the team and replaced it with an algorithm.
It is not the first time Facebook has gotten into trouble politically speaking: in the past it has already had a major clash with United States Republicans
Another criticism it frequently receives concerns the "filter bubble": Facebook is designed to show you what its algorithm believes is relevant to you. That's why, if you're conservative, you are likely to see conservative news in your timeline. What does this have to do with false news? It is not only that you are not offered another "view"; it is that, if someone publishes a false story in an outlet of a certain political leaning, it is unlikely that your Facebook feed will show you an outlet of another orientation debunking it. And vice versa.
Facebook goes into action
Facebook's fight against false news is not new. In January 2015, the social network announced a new system in which users were responsible for flagging a publication as fake. "We cannot read and check everything, so what we've done is allow people to flag things as false," the head of Facebook's news feed explained to TechCrunch last month, before all the controversy.
Facebook has not been the only online platform at which accusing fingers have been pointed over the spread of false news. This weekend, false information about the number of votes each candidate had received slipped in as the top story on Google when searching for "final vote count 2016", "final election numbers" or similar queries. The figures were not true. Yet somehow an unknown web page (70News) managed to place its story as the first result for searches as common as these. In fact, there is the proof: it still keeps showing up.
After the media coverage of the case, the search engine announced yesterday that it would take measures against sites that spread "false news". Specifically, it will stop serving ads to (and, as mentioned before in Facebook's case, cut off revenue from) pages that "misrepresent, misstate, or conceal information about the publisher, the publisher's content, or the primary purpose of the web property".
Facebook and Google, arbiters of what is true and what is not?
Whether Facebook's false news influenced voters or not is something we may never know (although I'm sure there will be studies and academic publications on the subject). Now, should Facebook, Google and other online platforms be responsible for "filtering" what is false and what is not? In the end it comes down to a debate we have already heard on several occasions: is Facebook a media outlet and, as a consequence, does it have responsibilities?
On this point, Facebook has always been clear: "We are a technology company, not a media company," said Zuckerberg this summer. That is so because, according to him, "we build the tools, we do not produce any content". Even so, they have taken "editorial" decisions in the past. For example, they deleted from their posts the famous Vietnam photo featuring a naked girl because it did not comply with their policy, but then allowed it again, deeming it "newsworthy". They acknowledged at the time that they make "exceptions" to their rules about what can and cannot be published.
Is Facebook a media outlet or a platform? They say a platform, but they have already taken editorial decisions in the past
If we go back to the BuzzFeed study I mentioned at the beginning, the data from the pages they analyzed is represented in the graph I leave here. Yes, the "left-wing" and "right-wing" pages have a high percentage of "mostly false" content, but the "mainstream" pages also have a small percentage of "mixture of true and false". For the study, they decided to classify as "mixture of true and false" those news items "written only from anonymous sources or unverified claims". As an example of one, they point to this political story about how George H.W. Bush was going to vote for Hillary.
"Our team will closely review all potential publishers and will closely monitor existing ones to ensure compliance," said Facebook a few hours ago. When can an unverified story be considered a false one? How many stories does an outlet have to publish that qualify as "false" before Facebook (and Google) expel it from their ad platforms? What percentage? If they are going to apply a filter like this, they will have to find a standard by which to apply these new rules. But which one?