Facebook May Have Been Running Non-consensual Experiments on Users Since 2007

One of the most discussed topics of the past week was the revelation that, in 2012, Facebook secretly conducted an experiment on more than 600 thousand users. Given the enormous backlash, Sheryl Sandberg, the company's chief operating officer, apologized for the "failure of communication". But that may not be enough: according to WSJ.com, Facebook has been experimenting on its users since 2007.

In the 2012 case, researchers altered the content displayed on users' timelines to find out whether certain information flows could have an emotional impact. The answer was yes: timelines loaded with more negative posts led many users to publish a larger number of melancholic or pessimistic posts, for example.

At first glance, this might not seem all that serious. The problem is that many people had their emotions manipulated, however slightly, without ever suspecting that their timelines were the trigger.

Sources close to Facebook told WSJ.com, however, that this is just the tip of the iceberg: according to them, hundreds of experiments have been carried out without users' knowledge since 2007, the year the company assembled a "Data Science" team made up of software engineers, artificial intelligence experts, psychologists, and other professionals.

One of the sources revealed that tests were run so often that some researchers began to fear the same groups of users were being used in more than one experiment, with one test interfering with the results of another.

There were reportedly manipulations to determine how families communicate on Facebook, for example. In other tests, researchers tried to identify the causes of loneliness. The sources even described experiments analyzing the impact of politically charged messages on user behavior.

The most appalling experiment may be one allegedly run for about two years: Facebook is said to have blocked the accounts of thousands of users and sent them emails claiming the reason was suspicion that their profiles were fake.

To have their access restored, users had to follow the procedures in the message to prove that their accounts were real. It could have passed for a simple security check, if not for one detail: Facebook knew these accounts were legitimate. The company reportedly caused this disruption solely to evaluate the effectiveness of its anti-fraud systems.

Even if Facebook's terms of use leave room for procedures like these, the accusations are worrying: experiments carried out this deliberately could worsen the emotional state of someone already going through a serious problem, for example, however remote that possibility may be.

The subject is controversial and open to much debate. After all, even with the authorization granted by its terms of use, Facebook would be ethically expected to obtain explicit consent from users, as university researchers do.

Facebook's luck, so to speak, is that the results of these experiments were not published in papers like the 2012 study, making them difficult to prove. Even so, it would not be surprising if Sheryl Sandberg, Mark Zuckerberg, or another of the company's executives tried to clarify the matter in the coming days.