Facebook Conducted a Weird Social Experiment without Users’ Knowledge

Facebook has been messing with your mind. The social network partnered with researchers to see whether they could manipulate the emotions of nearly 700,000 users—without their knowledge. Their massive social experiment (which took place in 2012) showed that by manipulating the algorithm that determines which posts appear in users’ feeds, they could affect the overall mood of individual users. As Business Insider explains, “The scientists say they found that when people saw fewer positive posts on their feeds, they produced fewer positive posts and instead wrote more negative posts. On the flip side, when scientists reduced the number of negative posts on a person’s newsfeed, those individuals became more positive themselves.” The only problem is, none of the users knew they were being experimented on.

Many media watchers have called the tests unethical and have criticized the site for not seeking users’ permission—beyond a purposely vague Terms of Use agreement—before involving them. Even a statement released by Facebook, further explaining the tests and apologizing for any “anxiety it caused” in an effort to quell the uproar, has been seen as intentionally unclear, failing to address some users’ concerns. (Like, why exactly was the U.S. Army a backer for this research?) The Atlantic notes that Facebook evidently changes its algorithm all the time to run tests on users, so we may all just be part of ongoing research anyway …
