For one week in 2012, Facebook skewed nearly 700,000 users’ News Feeds to be either happier or sadder than normal. Once the experiment was over, those users tended to post correspondingly positive or negative content: Facebook had manipulated the emotions of hundreds of thousands of its users and found, it said, that they would pass those happy or sad emotions on.
Susan Fiske, the Princeton University professor of psychology who edited the study, told the Atlantic that she had ethical concerns about its methods. She suggested the findings could be used by Facebook to encourage users to post more, or by other agencies, such as governments, to manipulate the feelings of users in particular countries. “It’s ethically okay from the regulations perspective, but ethics are kind of social decisions. The level of outrage that appears to be happening suggests that maybe it shouldn’t have been done… I’m still thinking about it and I’m a little creeped out too.”
Facebook’s ‘Data Use Policy’ — part of the Terms Of Service that every user signs up to when creating an account — reserves the right for Facebook to use information “for internal operations, including troubleshooting, data analysis, testing, research and service improvement.”
The researchers said that this policy constituted the informed consent required to conduct the research and made it legal. The study does not say whether users were told of their participation in the experiment, which the researchers said was carried out by computers so that they themselves saw none of the posts.
Facebook is considering secretly watching and recording users through their webcams and smartphone cameras, a newly discovered patent suggests. It would analyse those images to work out how you feel, and use the information to keep you on the site for longer. The document explains how the company would use technology to see how your facial expressions change when you come across different types of content on the site.
If you smiled as you looked at pictures of one of your friends, for instance, Facebook’s algorithm would take note of that and display more pictures of that friend in your News Feed.
Another example included in the patent application explains that if you looked away from your screen when a video of a kitten played, Facebook would stop showing similar videos in your Feed.
In another case, the document says that if you happened to watch an advert for Scotch, Facebook could choose to target you with more adverts for Scotch.
The patent application was submitted in February 2014 and published in August 2015, but was only recently spotted by CB Insights.
“We often seek patents for technology we never implement, and patents should not be taken as an indication of future plans,” said a Facebook spokesperson.
However, the document raises yet more concern about a company that, in 2014, was found to have secretly manipulated hundreds of thousands of users’ News Feeds as part of an experiment to work out whether it could affect people’s emotions.
The company later admitted that it “failed to communicate clearly why and how we did it”. The research has provoked distress because of the manipulation involved.
Last year, a picture posted by Mark Zuckerberg showed that he covers his webcam and microphone with tape. The public rather predictably made a big deal out of it, and the discovery of Facebook’s patent will only fuel speculation.
The site isn’t believed to have put its plans into action yet, and there’s no guarantee that it ever will. The patent also details a new text-messaging platform that would detect how hard you type, and use that information to attempt to work out how you feel.
Facebook’s newest patent, granted on May 25, 2017, aims to monitor users’ typing speed to predict emotions and adapt messages in response.
We took a look at some of Facebook’s emotion-based patents to understand how the company is thinking about capturing and responding to people’s emotional reactions, which has been a tricky area for consumer tech companies but key to their future. On the one hand, companies want to identify which content is most engaging and respond to audiences’ reactions; on the other, emotion detection is technically difficult, not to mention a PR and ethical minefield.