The privacy group Electronic Privacy Information Center (EPIC) has filed a complaint with the FTC over Facebook's social experiment on roughly 700,000 of the site's users back in January 2012. To test the effects of positive and negative status updates on user responses, the company altered its site algorithm for these users – for better or worse.
What we've learned in the last twenty-four hours is that Facebook's 2012 experiment is just one of several social experiments the company has conducted, reaching as far back as the birth of its social networking site. In other words, Facebook users have been "virtual lab rats" for years – and for some of those years, Facebook had not yet added an "internal operations" clause to its data use policy.
Facebook claims, however, that it's done nothing wrong, citing a 2012 clause that didn't appear in the policy until four months after the study was conducted. The question becomes: did Facebook violate its own user policy and basic research ethics – or is Facebook's virtual experimentation nothing more than business as usual?
The dilemma over virtual experiments
Virtual experiments are performed online every day, and many customers and users never suspect they're happening. All kinds of things are said on websites – things that are horrible and disrespectful of other human beings – yet they're said regardless of how they make the intended recipient feel. Online work comes with hires and fires, and plenty of clients hire and fire someone without the slightest regard for that individual's income or livelihood. These things happen, and the unfortunate victims can do little to seek justice.
Against this common backdrop of what can often be seen as verbal abuse, Facebook maintains that its actions weren't invasive, nor can they be attributed to "emotional abuse" of any kind. In short, the company believes that when you use its services, you agree to be a subject in the virtual world – that whatever happens on-screen is no different from the company changing its user policies without your consent. Facebook should alert users when its policy changes and when it seeks to make them participants in a social experiment, but the company doesn't feel it needs to inform users about such things.
While this may be Facebook’s reasoning over the matter, the company’s own actions indicate otherwise.
Facebook’s social experiment as an ethical breach
Doctors may believe a patient needs certain tests run, but they can't test a patient's blood, urine, or anything else without the individual's consent. Even if the patient is unable to make decisions for himself or herself, doctors can't proceed without the consent of whoever holds the patient's power of attorney (in most cases, a relative). In other words, consent is key to any experiment or test.
Just because Facebook and its users operate in the virtual world doesn't mean the company deserves a blind pass.
After all, users were involved in social experiments spanning the five years before the internal operations clause was added to Facebook's 2012 policy. And, as with any testing, it's only fair that the subjects involved (us) give their consent. Operating in the virtual realm doesn't give Facebook the right to subject us to unknown experiments in the name of "virtual testing," any more than a person has the right to stalk us on the Web simply because no physical harm is done. Since we are human beings, not lab rats, Facebook must deal with us on the basis of our consent. If we were true lab rats, Facebook users worldwide wouldn't be rising up in anger; our consent wouldn't matter if we weren't human beings with intellect, rights, and liberties.
Facebook, however, isn't guilty only because it assumes that virtual testing makes us default "lab rats" from whom no consent is needed; the company is also guilty because of how it eventually added the internal operations clause. It's likely that Zuckerberg's company knew the gap in its policy left it exposed to a lawsuit, so, to cover its tracks after the fact, Facebook added the clause four months after conducting the January 2012 study.
Notice, however, that Facebook didn't seek to inform users at the time that an experiment had recently been performed by the company and its partnering researchers. Instead, it quietly added the clause and went on its way. This amounts to an ethical breach, followed by after-the-fact damage control: sealing a legal "leak" that would have landed the billion-user networking site in boiling-hot water had any Facebook user (or a user's lawyer, for that matter) ever checked the data use policy.
Facebook can say what it will, but the company is guilty of performing an illegal social experiment on users: nothing in its January 2012 policy informed them of the internal testing to which they were unknowingly being subjected. Then, when pressed, the company pointed back to a clause added four months later. It seems the clause was put in the policy not to inform users – but simply to cover Facebook's tracks.
In short, Facebook cares about Facebook…not about you. Maybe it’s time we get that message and respond accordingly.