
Did Facebook Cross The Line This Time?

    June 30, 2014
    Chris Crum

Facebook went and freaked a bunch of people out again. They were about due, weren’t they? This time, the freak-out comes from an academic study, of all things, looking at how Facebook can manipulate users’ emotions through the posts it chooses to show in the News Feed.

Some people feel Facebook has crossed a line here, while others essentially consider it par for the course on the Internet of today (not to mention on Facebook itself).

Are you comfortable knowing that Facebook can potentially alter your mood by showing you certain types of posts? Did Facebook cross the line? Share your thoughts in the comments.

The study is called “Experimental Evidence Of Massive-Scale Emotional Contagion Through Social Networks”. The abstract explains:

Emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness. Emotional contagion is well established in laboratory experiments, with people transferring positive and negative emotions to others. Data from a large real-world social network, collected over a 20-y period suggests that longer-lasting moods (e.g., depression, happiness) can be transferred through networks…although the results are controversial. In an experiment with people who use Facebook, we test whether emotional contagion occurs outside of in-person interaction between individuals by reducing the amount of emotional content in the News Feed. When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks. This work also suggests that, in contrast to prevailing assumptions, in-person interaction and nonverbal cues are not strictly necessary for emotional contagion, and that the observation of others’ positive experiences constitutes a positive experience for people.

“We show, via a massive (N = 689,003) experiment on Facebook, that emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness,” the researchers say in the “significance” section. “We provide experimental evidence that emotional contagion occurs without direct interaction between people (exposure to a friend expressing an emotion is sufficient), and in the complete absence of nonverbal cues.”

You can dig into the full study at PNAS: http://www.pnas.org/content/111/24/8788.full

Naturally, people are a little uncomfortable with Facebook taking these liberties.

It’s important to keep this in perspective, though. They did this with a reported 0.04% of users over a single week two years ago. That might not make you feel any less dirty, but chances are you weren’t included, and even if you were, it was a long time ago and likely of little significance to you now beyond a general creepy feeling. Of course, you never know when they’re running any kind of experiment, so things like this could really happen at any time without notice.

Facebook’s Adam Kramer, who co-authored the study, took to Facebook to respond to the outrage:

The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product. We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook. We didn’t clearly state our motivations in the paper.

Regarding methodology, our research sought to investigate the above claim by very minimally deprioritizing a small percentage of content in News Feed (based on whether there was an emotional word in the post) for a group of people (about 0.04% of users, or 1 in 2500) for a short period (one week, in early 2012). Nobody’s posts were “hidden,” they just didn’t show up on some loads of Feed. Those posts were always visible on friends’ timelines, and could have shown up on subsequent News Feed loads. And we found the exact opposite to what was then the conventional wisdom: Seeing a certain kind of emotion (positive) encourages it rather than suppresses it.

And at the end of the day, the actual impact on people in the experiment was the minimal amount to statistically detect it — the result was that people produced an average of one fewer emotional word, per thousand words, over the following week.

The goal of all of our research at Facebook is to learn how to provide a better service. Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone. I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety.

While we’ve always considered what research we do carefully, we (not just me, several other researchers at Facebook) have been working on improving our internal review practices. The experiment in question was run in early 2012, and we have come a long way since then. Those review practices will also incorporate what we’ve learned from the reaction to this paper.

Feel better about it now?
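
For those curious about the mechanics Kramer describes, here is a minimal, purely hypothetical sketch (in Python) of word-list-based deprioritization: posts containing words from an emotion lexicon are randomly skipped on a given feed load, but remain visible on friends’ timelines and eligible for later loads. The word list, probability, and data model below are invented for illustration; this is not Facebook’s code, and the actual study classified posts with the LIWC word-counting software rather than a toy lexicon like this one.

    import random

    # Toy stand-in for a real emotion lexicon; purely illustrative.
    POSITIVE_WORDS = {"happy", "love", "great", "wonderful"}

    def contains_emotion(post_text, lexicon):
        # True if any word in the post appears in the lexicon.
        words = {w.strip(".,!?").lower() for w in post_text.split()}
        return bool(words & lexicon)

    def filter_feed(posts, lexicon, omit_probability=0.1, seed=None):
        # Randomly skip a small share of emotional posts for this feed load only.
        # Skipped posts are not deleted; a later load (a new call) may show them.
        rng = random.Random(seed)
        kept = []
        for post in posts:
            if contains_emotion(post, lexicon) and rng.random() < omit_probability:
                continue
            kept.append(post)
        return kept

    feed = ["Had a wonderful day!", "Meeting moved to 3pm.", "I love this song."]
    print(filter_feed(feed, POSITIVE_WORDS, omit_probability=0.5, seed=42))

Even a toy like this makes it clear why the measured effect was so small: only posts that happen to contain lexicon words are ever affected, and only on some loads.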

As others have pointed out, Facebook’s terms of service pretty much allow it to do this type of stuff as it pleases. Unfortunately, it has come to light that Facebook made the changes to its terms covering the experiment four months after it was actually conducted.

Sam Biddle at Valleywag writes, “The most valuable lesson for the company might be that it can keep creeping us out and violating its customers, over and over again, and none of us will ever delete our accounts. I’d love to read that study.”

Let’s just hope nobody involved in the experiment killed themselves. It certainly wouldn’t be the first time we’ve heard of suicides related to Facebook.

At least it’s probably in Facebook’s interest to keep you happy rather than depressed. If you read too much depressing stuff on Facebook, you might decide you don’t want to use it so much. And that would of course mean that you won’t click on ads.

Facebook continues to tweak its algorithm based on your behavior. While the study was conducted in 2012, just last week, the company announced changes to how it shows videos to users. Facebook knows how much video you’re watching, and will show you more if you watch more and vice versa.
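
To give a rough idea of what that kind of behavior-based ranking can look like, here is another hypothetical sketch: video posts get a boost proportional to how much video a user has recently watched. The field names and scoring formula are invented for illustration; Facebook’s actual ranking system is far more complex and not public.

    from dataclasses import dataclass

    @dataclass
    class Post:
        text: str
        is_video: bool
        base_score: float  # relevance from other signals (invented)

    def rank_feed(posts, video_watch_ratio):
        # Boost video posts for users who spend more of their feed time on video.
        # video_watch_ratio is a fraction between 0.0 and 1.0.
        def score(post):
            boost = 1.0 + video_watch_ratio if post.is_video else 1.0
            return post.base_score * boost
        return sorted(posts, key=score, reverse=True)

    feed = [Post("Vacation photos", False, 0.8), Post("Cat video", True, 0.6)]
    print([p.text for p in rank_feed(feed, video_watch_ratio=0.7)])

Run with a heavy video watcher, this toy ranker pushes the lower-scoring video above the photo post; with a watch ratio near zero it would not.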

In reality, as an internet user, you’re subject to all kinds of tests from the various services you use, not to mention advertisers, at any point in time. It’s the trade-off we make in exchange for the sites and apps we use every day.

Is using Facebook worth letting them decide what content they want to show you at any given moment? Let us know what you think.


  • http://www.sbwebcenter.com/ Steve B

    I think as long as they’re not using psychology for ill reasons, it’s ok. Marketing involves a lot of psychology to sell.

  • Martin Sutherland

    Sucks.

  • http://www.bloketoys.co.uk/ BlokeToys.co.uk

    This was entirely unethical, immoral and should result in criminal proceedings. FB and the universities involved in this “study” (let’s not fool ourselves, this was a direct attempt at mind control by a sick corporation desperate to make money at all costs) should be dragged in front of a court and given a f**king beating.

    Advertisers have long been attacked around the world for abusing customers in various ways, but this is taking it to a whole new level of immorality.

    Manipulating and studying people without their consent is psychopathic. Whoever was involved in this should be facing prison time, and FB should be paying out millions in damages to those they abused.

  • Khon

    Of course, if you don’t use FB, there is no problem. Getting people to abandon it is the hard part.

  • Fred Smoot

    Face fuckbook.

  • Chris McElroy

    It just shows more evidence that FB lacks credibility, especially as a business platform. What you see is whatever FB wants you to see and if a business has no access to the data they collect, then it’s worthless to the businesses who use FB.

  • http://www.wix.com/polalor/republic Peter J. O’Lalor, Ph.D.

    I deleted my FB account following their updated “Privacy rights notice.” It was apparent from this new policy that, if I were to remain, it would give FB a lot more influence than seemed warranted. Sorry, I can’t remember the details, but it was definitely because of the new “Privacy” notice, about a month ago.

  • Bariebel

    Facebook should know that trying to apply mind control is unethical, immoral and most of all criminal, and thus should be reported to law enforcement agencies.

    They could wreak havoc with kids’ minds and maybe even with the immature, unstable minds of adults, and for that matter maybe with all of us. Most frightening!

    Who knows where this could end up in the future; it is an extremely dangerous game indeed, particularly if it ends up in the wrong hands and is not stopped. It could create the most adverse lifestyle conditions. Right now it is maybe just innocent dabbling by Facebook, but use your imagination for a moment: science fiction could come true in the wrong hands.

  • Brady Harness

    I Believe this was a Bad Mistake on behalf of FaceBook; but just another step for Giant Corporations to practice Mind Control like in the book 1984!!!

  • http://www.backwaterstudio.com Kathleen Johnson

    Any Facebook algorithm that alters the organic News Feed impacts the “user” in some way. So, they used one to manipulate the feed and watched the barely measurable results – not a rousing impact on the user despite the drama ensuing with “some” who use Facebook.

    Personally, I thought it was a constructive use of a Social Medium and the study had a lot of relevancy in today’s society.

    I applaud Facebook for the study and I hope they are transparent enough to share the outcome and results of other such studies. Of course, you have to know they have done that.

    In this case, keep up the good work Facebook. You have the power and scope to make these studies and use the results to further implement a sounder and responsive platform.

  • Loki57

    What can be done to penalize the guilty, when Facebook conspires with 3 arrogantly disgusting big name schools, to impose a psych study on select users with no prior notice or consent, and then changes its TOS months later to pretend those users agreed all along?

    Sounds like commercial fraud there. The study itself, on manipulating emotions by tampering with algorithms, does not note IRB approvals at any of the scofflaw schools, those being UCSF, Cornell, and Princeton.

    How can undisclosed compliance with, or violation of, HHS IRB rules for schools receiving Federal funds, be confirmed? (US Health and Human Services, who administer specific Federal law requiring Institutional Review Board approvals of ethics and other criteria prior to the start of any such human subjects study)

    http://www.pnas.org/content/111/24/8788.full

    A $5000 court ordered payment to every user of Facebook, treble or more of that to direct subjects, and a year of suspension and 3 years of academic probation for the school participants, would likely get noticed and inhibit recurrences. It’s unlikely anything serious will result, but what if what was done was criminal, plus civil fraud, as appears likely?

    What other potential legal or regulatory issues exist there?

    This is NOT just normal marketing, when contract fraud and Federal funds recipient universities are involved.

    ==

    In case anyone’s interested, I’ve never trusted or maintained a Facebook account, but have a few times used sock puppets to investigate specific short term concerns.

  • http://www.spainnewsinenglish.website/ Anne Sewell

    I sincerely wish I could just close my Facebook account and tell “Suckerberg” to go f*ck himself. Unfortunately, as so many others know, you need the damn platform to share and get business. As for the news feed, Facebook give us back the normal feed, i.e. as our friends share, we see the posts in chronological and logical order! I have missed so many important posts since they changed things. It really sucks. While on the subject, I am also sick of them trying to profile us by trying to find out which books, movies, and other endless sh*t we like.

  • Martin Sutherland

    The world is not a laboratory and humans are not rats. They are amoral control freaks who think they can manipulate reality.

  • w1z111

    Hmmm…ok, I think I’m alright with this, to a point. Hopefully, the FB team(s) behind the studies are aiming toward something ‘positive’ in their efforts to better understand (and better ‘manipulate’?) users’ attitudes, moods, and other online attributes and behaviors. There could be some real goodness in such power, if used appropriately. Indeed, Facebook could become a real catalyst for positive change in our world…if they can use their influence toward that end.

    On the other hand, we know it could go the opposite way, if FB decides to use their power for ‘evil’ purposes like trying to instill negativity in users for the sake of promoting or selling advertisements; or if they try to alter a positive atmosphere and mood toward something more negative.

    Time will tell, I guess…I have hopes that Mark Z. and all his folks are ok people, and are being motivated by something much more important and profound than simple cash or wealth or fame, though I know that’s a lot to expect.

    I’m just sayin’…

  • Loki57

    I checked with a lawyer friend, with a close relative who is an admin in one of the named university systems. He suggests that even had Facebook done its ToS change prior to this study, that could not constitute “informed consent” for a human research subject, and so no valid IRB could approve this study design. Class Action time, as by definition, all subjects legally are victims even if direct harm is minimal, and all Facebook users have cause to demand notification of whether or how they were used, and claims against Facebook and the universities.

    As another friend who used to head at times an IRB and a med school ethics panel taught me, there are many treacherous ways to design studies to not use human subjects, but cause effectively identical data set review. These researchers appear to be immature idiots relative to those professional skills.

    I’ll admit I didn’t clearly understand those ethical and legal issues until I had gray hairs. Young, aggressive, publish or perish academics have an obligation to know and comply with these standards, even if business sector marketing and market research people do almost identical actions daily.

    How could Netizens, here and beyond, educate Facebook users about how and why this appears to constitute legal violations against them, where it appears Facebook, 3 universities, and named study staff and other John and Jane Does 1-100, qualify to be on the wrong side of litigation? That education about ethics and law could be more valuable than this study itself.

  • William MATAR

    For a long time now I have been using Google Plus, which is far more beautiful and cleaner than the spammy fakebook.

  • oohdale

    I think if I received 25,000 dollars then I would be ok. Other than that, I will sue.

  • Loki57

    See Chris’s follow up article, and my comments to it, for added facts and sources that answer some of the questions here about who is (and isn’t) guilty of what unethical or illegal acts, as well as more issues of our society, ethics, and law.

    http://www.webpronews.com/actually-facebook-changed-its-terms-to-cover-that-experiment-after-it-was-over-2014-07?utm_source=Social+Media_sidebar

    It appears, between Forbes’ investigation and the study itself, that English-speaking Facebook users internationally were broadly selected, AND that some of those users were likely ages 13-18, if not younger (having lied about their age so as not to be cut off from friends). Also, Cornell’s IRB has serious issues, while UCSF was not involved other than as a future employer of one post-doc. That both expands and narrows the serious legal and ethical issues.

  • Mir

    Wrongdoing. You cannot use human beings as research subjects without their consent, no matter the purpose, especially research related to an individual’s psychological behavior when there is no guarantee of privacy or confidentiality. Totally against it. There should be legal consequences.

  • Klaus Kaufmann

    I agree with Khon; Facebook has broken many promises and is ever more just another marketing tool so they can make lots of money on selling ads… but FB is hard to quit! We got addicted!

  • http://www.socialrep.com/ Chris Kenton

    Apologists for FB keep saying it was a small study, a long time ago, on just a fraction of FB’s user base, get over it. That isn’t the point. The point is that FB conducted psychological experiments on its users without consent.

  • Guest

    Crossed the line, deleted my Facebook acct.

  • guest

    The academics involved knew exactly what they were doing. You are not allowed to use human subjects for research purposes without their consent—so they skirted the process and got their data anyway. Shame on the institutions!

  • Mark Meixner

    For as long as humans have been human, people have had the ability to control life, or to control other people. In our case, the fact is that all of our lives are manipulated. The thing is, the way children are manipulated in the schools and the way people are manipulated by the media is much worse than the fact that Facebook manipulates the news feed.

  • Mike

    There’s a difference between standing on the corner observing how many train wrecks occur, and standing on the corner altering the operation of the warning signal while observing how many train wrecks occur. A study doesn’t have to create groups with overall positive or negative feeds to make observations. It can be designed to find representative positive and negative groups and then correlate the resulting emotions in their posts over the following week. It sounds like a lazy experimenter.