Can Facebook Manipulation Spread Emotional Contagion? Professor of Psychology Who Edited The Secret Study Has “Ethical Concerns About FB Methods,” Admitting She Was “A Little Creeped Out Too.”


For one week in 2012, Facebook skewed nearly 700,000 users’ news feeds to be either happier or sadder than normal. The experiment found that, afterwards, users tended to post comments that were more positive or more negative, matching the skew that had been applied to their news feeds. In other words, Facebook manipulated the emotions of hundreds of thousands of its users and found that they would pass those happy or sad emotions on, it has said.
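To make the mechanism concrete, here is a rough sketch, in Python, of the kind of feed skewing described above: posts containing words of one sentiment are probabilistically withheld from a user’s feed. The word lists, function names and omission rate are illustrative assumptions, not Facebook’s actual code.

```python
# Hypothetical sketch of the feed manipulation described above: posts carrying
# words of one sentiment are probabilistically withheld, skewing the remaining
# feed happier or sadder. Word lists, names and the omission rate are assumptions.
import random

POSITIVE_WORDS = {"happy", "great", "love", "wonderful"}
NEGATIVE_WORDS = {"sad", "awful", "hate", "terrible"}

def skew_feed(posts, suppress="negative", omission_rate=0.5, seed=42):
    """Return a feed with a share of posts of the suppressed sentiment removed."""
    rng = random.Random(seed)
    target = NEGATIVE_WORDS if suppress == "negative" else POSITIVE_WORDS
    skewed = []
    for post in posts:
        words = set(post.lower().split())
        if words & target and rng.random() < omission_rate:
            continue  # withhold this post from the user's feed
        skewed.append(post)
    return skewed

feed = ["What a wonderful day", "Feeling awful about the news", "Lunch was fine"]
print(skew_feed(feed, suppress="negative"))  # the 'awful' post may be dropped
```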

Susan Fiske, the professor of psychology at Princeton University who edited the study, told The Atlantic that she had ethical concerns about its methods. The research raises concerns that the findings could be used by Facebook to encourage users to post more, and/or by other agencies, such as governments, to manipulate the feelings of users in certain countries. She said: “It’s ethically okay from the regulations perspective, but ethics are kind of social decisions. The level of outrage that appears to be happening suggests that maybe it shouldn’t have been done… I’m still thinking about it and I’m a little creeped out too.”

Facebook’s ‘Data Use Policy’ — part of the Terms Of Service that every user signs up to when creating an account — reserves the right for Facebook to use information “for internal operations, including troubleshooting, data analysis, testing, research and service improvement.”

The researchers said that constituted the informed consent required to conduct the research and made it legal. The study does not say whether users were told of their participation in the experiment, which the researchers said was conducted by computers so that no researchers saw the posts themselves.

Source: http://www.independent.co.uk/life-style/gadgets-and-tech/facebook-manipulated-users-moods-in-secret-experiment-9571004.html

In other news.

Facebook: good for democracy or a way to wage psychological warfare on voters?

Research shows that ‘likes’ could predict sexual orientation, ethnicity, religious views, intelligence, happiness and political beliefs.

https://www.independent.co.uk/news/uk/politics/facebook-good-for-democracy-or-a-way-to-wage-psychological-warfare-on-voters-a7770616.html




Facebook is considering secretly watching and recording users through their webcams and smartphone cameras, a newly discovered patent suggests. It would analyse those images to work out how you feel, and use the information to keep you on the site for longer. The document explains how the company would use technology to see how your facial expressions change when you come across different types of content on the site.

If you smiled as you looked at pictures of one of your friends, for instance, Facebook’s algorithm would take note of that and display more pictures of that friend in your News Feed.

Another example included in the patent application explains that if you looked away from your screen when a video of a kitten played, Facebook would stop showing similar types of videos in your Feed.
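Purely as an illustration of the feedback loop those examples describe, here is a small sketch: a detected smile nudges a topic’s ranking weight up, looking away nudges it down. The class, topic names and weight factors are assumptions, not anything taken from the patent.

```python
# Illustrative sketch (not Facebook's implementation) of the feedback rule the
# patent examples describe: a detected smile raises the ranking weight of
# similar content, looking away lowers it. All names and weights are assumptions.
from collections import defaultdict

class ReactionRanker:
    def __init__(self):
        self.weights = defaultdict(lambda: 1.0)  # per-topic ranking weight

    def record_reaction(self, topic, reaction):
        if reaction == "smile":
            self.weights[topic] *= 1.2   # show more of this topic
        elif reaction == "looked_away":
            self.weights[topic] *= 0.8   # show less of this topic

    def rank(self, items):
        # items: list of (item_id, topic) pairs, highest-weighted topic first
        return sorted(items, key=lambda it: self.weights[it[1]], reverse=True)

ranker = ReactionRanker()
ranker.record_reaction("friend_alice_photos", "smile")
ranker.record_reaction("kitten_videos", "looked_away")
print(ranker.rank([("v1", "kitten_videos"), ("p1", "friend_alice_photos")]))
```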


TECHNIQUES FOR EMOTION DETECTION AND CONTENT DELIVERY
Abstract
Techniques for emotion detection and content delivery are described. In one embodiment, for example, an emotion detection component may identify at least one type of emotion associated with at least one detected emotion characteristic. A storage component may store the identified emotion type. An application programming interface (API) component may receive a request from one or more applications for emotion type and, in response to the request, return the identified emotion type. The one or more applications may identify content for display based upon the identified emotion type. The identification of content for display by the one or more applications based upon the identified emotion type may include searching among a plurality of content items, each content item being associated with one or more emotion type. Other embodiments are described and claimed.
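Read literally, the abstract describes a pipeline of components. The sketch below is one possible reading of that layout, with an emotion detection component, a storage component, an API component that applications query, and a content lookup keyed on emotion type. All class and method names, and the characteristic-to-emotion mapping, are assumptions made for illustration.

```python
# A minimal sketch of the component layout the abstract describes: an emotion
# detection component, a storage component, an API component that answers
# requests, and an application that picks content tagged with the returned
# emotion type. Names and the mapping below are assumptions, not the patent's.

EMOTION_FOR_CHARACTERISTIC = {   # assumed mapping from detected characteristic
    "smile": "happy",
    "frown": "sad",
    "gaze_away": "bored",
}

class EmotionDetectionComponent:
    def identify(self, characteristic):
        return EMOTION_FOR_CHARACTERISTIC.get(characteristic, "neutral")

class StorageComponent:
    def __init__(self):
        self._latest = None
    def store(self, emotion_type):
        self._latest = emotion_type
    def load(self):
        return self._latest

class EmotionAPI:
    def __init__(self, storage):
        self.storage = storage
    def get_emotion_type(self):          # applications call this on request
        return self.storage.load()

def select_content(api, content_items):
    """Search items, each tagged with emotion types, for the current emotion."""
    emotion = api.get_emotion_type()
    return [c for c in content_items if emotion in c["emotion_types"]]

storage = StorageComponent()
storage.store(EmotionDetectionComponent().identify("smile"))
api = EmotionAPI(storage)
catalog = [{"id": 1, "emotion_types": ["happy"]}, {"id": 2, "emotion_types": ["sad"]}]
print(select_content(api, catalog))  # -> items tagged 'happy'
```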

 

In another case, the document says that if you happened to watch an advert for scotch, Facebook could choose to target you with more adverts for scotch.

The patent application was submitted in February 2014 and published in August 2015, but was only recently spotted, by CB Insights.

“We often seek patents for technology we never implement, and patents should not be taken as an indication of future plans,” said a Facebook spokesperson.

However, the document raises yet more concern about a company that, in 2014, was found to have secretly manipulated hundreds of thousands of users’ News Feeds as part of an experiment to work out whether it could affect people’s emotions.

The company later admitted that it “failed to communicate clearly why and how we did it”. The research has provoked distress because of the manipulation involved.

Last year, a picture posted by Mark Zuckerberg showed that he covers his webcam and microphone with tape. The public rather predictably made a big deal out of it, and the discovery of Facebook’s patent will only fuel speculation.

The site isn’t believed to have put its plans into action yet, and there’s no guarantee that it ever will. The patent also details a new text-messaging platform that would detect how hard you type, and use that information to attempt to work out how you feel.

Source: https://www.independent.co.uk/life-style/gadgets-and-tech/news/facebook-plans-to-watch-users-through-webcams-spy-patent-application-social-media-a7779711.html

Facebook’s newest patent, granted May 25, 2017, aims to monitor users’ typing speed to predict emotions and adapt messages in response.
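As a rough illustration only, typing-based inference of this sort could look something like the sketch below, where characters per second (and, where available, key pressure) map to a coarse emotional label that a messaging app could react to. The thresholds and labels are invented for the example and do not come from the patent.

```python
# Hypothetical sketch of typing-based emotion inference as described above:
# typing speed and (where hardware supports it) key pressure map to a coarse
# emotional state. Thresholds and labels are assumptions, not patent values.

def infer_emotion(chars_per_second, pressure=None):
    if pressure is not None and pressure > 0.8:
        return "angry"            # hard, fast keystrokes
    if chars_per_second > 6.0:
        return "excited"
    if chars_per_second < 1.5:
        return "hesitant"
    return "neutral"

def adapt_message(text, emotion):
    """Toy adaptation step: annotate the outgoing message with the inferred mood."""
    return {"text": text, "suggested_tone": emotion}

print(adapt_message("see you at 8", infer_emotion(chars_per_second=7.2)))
```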

We took a look at some of Facebook’s emotion-based patents to understand how the company is thinking about capturing and responding to people’s emotional reactions, an area that has been tricky for consumer tech companies but is key to their future. On the one hand, they want to identify which content is most engaging and respond to audiences’ reactions; on the other, emotion detection is technically difficult, not to mention a PR and ethical minefield.


Good morning, Facebook. Today’s meeting will last approx. 15 hours. Coffee?