By Mary Schlangenstein
Facebook Inc. allowed researchers to manipulate what almost 700,000 readers saw on the website in a study that revived concerns about how much privacy online users can expect.
Researchers altered the number of good or bad comments posted in the users' news feeds, which also contain photos and articles, for a study published on June 17 in the Proceedings of the National Academy of Sciences. People shown fewer positive words were found to write more negative posts, while the reverse happened with those exposed to fewer negative terms, according to the trial involving random Facebook users in January 2012.
The data showed that online messages influence readers' "experience of emotions," which may affect offline behavior, the researchers said. Some Facebook users turned to Twitter to express outrage over the research as a breach of their privacy.
"Facebook knows it can push its users' limits, invade their privacy, use their information and get away with it," said James Grimmelmann, a professor of technology and the law at the University of Maryland. "Facebook has done so many things over the years that scared and freaked out people."
Even so, the anger won't have a long-lasting effect, Grimmelmann said. While some users may threaten to leave Facebook, most people "want to be where their friends are" and there is no alternative to the social networking site that provides more privacy, he said.
Face to Face
In the study, the researchers, from Facebook and Cornell University, wanted to see if emotions could spread among people without face-to-face contact.
The Facebook study is "really important research" that shows the value of receiving positive news and how it improves social connections, said James Pennebaker, a psychology professor at the University of Texas. Facebook might have avoided some of the resulting controversy by allowing users to opt out of taking part in any research, he said.
"It will make people a little bit nervous for a couple of days," he said in an interview. "The fact is, Google knows everything about us, Amazon knows a huge amount about us. It's stunning how much all of these big companies know. If one is paranoid, it creeps them out."
Facebook said none of the data in the study was associated with a specific person's account. Research is intended to make content relevant and engaging, and part of that is understanding how people respond to various content, the Menlo Park, California-based company said in a statement.
"We carefully consider what research we do and have a strong internal review process," Facebook said. "There is no unnecessary collection of people's data in connection with these research initiatives and all data is stored securely."
Among the study's authors, Adam Kramer, a Facebook data scientist, and Jeffrey Hancock, a Cornell professor in the communications and information science departments, didn't return calls or e-mails seeking comment. Jamie Guillory, also with Cornell, sent an e-mail referring questions to Kramer.
Susan Fiske, a psychology professor at Princeton University, edited the study for PNAS. She contacted the authors and was told it passed Cornell's human subjects' ethical review. The data had already been collected when the Cornell researchers became involved.
"From that point of view, this is an issue about Facebook, not about research ethics," she said in an interview. "My own decision was not to second-guess the Cornell" review board.
"People are relating to Facebook as if it has betrayed their trust," she said. "The level of reaction is understandable. That doesn't mean what Facebook or Cornell did is unethical."