A seven-day study conducted in 2012 reveals that Facebook used some 700,000 unsuspecting users as participants in an experiment demonstrating the power of “emotional contagion.”

The study, conducted by Facebook, Cornell University, and the University of California-San Francisco (UCSF), involved Facebook tampering with its news feed algorithm to alter which posts – positive or negative – appeared in users’ feeds. The goal was to see whether the emotional tone of the posts in a user’s news feed would influence the tone of that user’s own status updates. The result affirmed what the researchers had hypothesized: depending on the nature of the posts shown to them, users would in turn post correspondingly positive or negative content in their own status updates. “These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks,” the report said.

See Also: Facebook to collect users’ data for targeted ads, and how to opt out of it

Facebook says that its experiment was perfectly legal, but reactions to the company’s covert experiment on unsuspecting users have been negative so far. One user tweeted that it was time for him to delete his FB account. Others have called the study “creepy,” “disturbing,” and just plain “evil.”

See Also: Facebook dismantles its team handling Facebook Home

It seems that, in the world of social media and the Internet, users can become “participants” in virtual studies to which they never consented. We question Facebook’s idea of what’s legal and what’s not. How many users can be pulled into experiments in real time without their consent? Does this practice become acceptable simply because it happens behind a computer screen instead of in a doctor’s office or a science laboratory?

We don’t want to scare you, but be warned: you may find yourself in a new virtual experiment any day now.