
Beyond Emotional Contagion: All The Ways Facebook Is Using You

Image courtesy of mkhmarketing via Flickr, modified by Curiousmatic.

In an experiment on emotional contagion, Facebook set out to manipulate users’ emotions by algorithmically filtering positive and negative news feed posts and measuring the responses of a sample of over 600,000 users.

The study, conducted in partnership with Cornell researchers and published in the Proceedings of the National Academy of Sciences (PNAS), found that when users saw more positive posts, their own updates skewed neutral to positive; when exposed to more negativity, they skewed neutral to negative.
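Facebook’s actual code isn’t public, but the mechanism the paper describes is easy to illustrate. Below is a minimal Python sketch, assuming a toy word-list classifier in place of the LIWC word-counting software the researchers reportedly used; the word lists, omission rate, and function names are all invented for illustration.

```python
import random

# Toy word lists standing in for LIWC-style sentiment categories.
POSITIVE_WORDS = {"happy", "love", "great", "wonderful", "excited"}
NEGATIVE_WORDS = {"sad", "hate", "awful", "terrible", "angry"}

def post_sentiment(text):
    """Crudely label a post positive, negative, or neutral by word counts."""
    words = set(text.lower().split())
    pos, neg = len(words & POSITIVE_WORDS), len(words & NEGATIVE_WORDS)
    return "positive" if pos > neg else "negative" if neg > pos else "neutral"

def experimental_feed(posts, condition, omit_rate=0.3):
    """Withhold a fraction of posts matching the targeted sentiment.

    condition is 'reduce_positive' or 'reduce_negative', mirroring the
    study's two arms; omit_rate is an invented placeholder value.
    """
    feed = []
    target = "positive" if condition == "reduce_positive" else "negative"
    for post in posts:
        if post_sentiment(post) == target and random.random() < omit_rate:
            continue  # silently drop the post from this user's feed
        feed.append(post)
    return feed

print(experimental_feed(
    ["I love this wonderful day", "Traffic was awful", "Lunch at noon"],
    condition="reduce_positive"))
```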

Beyond its findings, the experiment as a whole elicited sad, angry, and distrustful responses across the board, and a few key aspects of Facebook’s practices have since come to light.

Long before the news broke, however, Facebook was already manipulating users in ways often unbeknownst to them, yet technically both consensual and legal.

How you agreed to experimentation

Facebook’s Data Use Policy specifically states the following:

we may use the information we receive about you: … for internal operations, including troubleshooting, data analysis, testing, research and service improvement.

Granting us permission to use your information not only allows us to provide Facebook as it exists today, but it also allows us to provide you with innovative features and services we develop in the future that use the information we receive about you in new ways.

How Facebook ads and algorithms affect you already

In our earlier and still-relevant piece on Facebook advertisement targeting, we described how the information users spew forth on Facebook is churned for advertising purposes: advertisers target users by age, location, interests, relationship status, and even browser tracking cookies to keep ads focused and relevant.
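As an illustration of how such criteria combine, here is a toy matcher; the field names and rules are hypothetical and don’t reflect Facebook’s actual ad platform.

```python
def matches_ad(user, ad):
    """Return True if a user profile satisfies every targeting rule on an ad."""
    if not (ad["min_age"] <= user["age"] <= ad["max_age"]):
        return False
    if ad["locations"] and user["location"] not in ad["locations"]:
        return False
    # Interest targeting: any overlap qualifies the user.
    return bool(set(user["interests"]) & set(ad["interests"]))

cat_food_ad = {"min_age": 18, "max_age": 65,
               "locations": {"New York"}, "interests": {"cats", "pets"}}
user = {"age": 29, "location": "New York", "interests": ["cats", "running"]}
print(matches_ad(user, cat_food_ad))  # True
```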

Brands, companies, and agencies perform studies like Facebook’s all the time; what is advertising, after all, if not an attempt to elicit an emotional response without consent, and to track the results?

And unlike Twitter and Instagram, Facebook already decides algorithmically what to show you in your news feed, prioritizing the visibility of popular posts, paid advertisements, and posts that don’t link outside Facebook. Google ranks search results in a similarly algorithmic fashion.
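The real ranking formula (once known as EdgeRank) is proprietary, so the sketch below only illustrates the kind of priority scoring described above, with weights invented for the example.

```python
def feed_score(post):
    """Score a candidate post; all weights here are made up for illustration."""
    score = post["likes"] + 2 * post["comments"]   # popularity signal
    if post["is_paid"]:
        score += 500                               # paid placement boost
    if post["links_offsite"]:
        score *= 0.5                               # demote external links
    return score

def rank_feed(posts):
    """Order candidate posts for display, highest score first."""
    return sorted(posts, key=feed_score, reverse=True)

posts = [
    {"likes": 120, "comments": 30, "is_paid": False, "links_offsite": False},
    {"likes": 5,   "comments": 1,  "is_paid": True,  "links_offsite": False},
    {"likes": 300, "comments": 80, "is_paid": False, "links_offsite": True},
]
print(rank_feed(posts))  # the paid post outranks both organic posts
```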

Such social influence has proven powerful before: a 2010 experiment that boosted a social message reminding users to vote in congressional elections resulted in an estimated 340,000 extra voters.

Informed consent in the digital age

As Facebook doesn’t receive federal funding for its research, it’s legally exempt from the “Common Rule” for experimentation, which requires, among other things, that human subjects be informed of an experiment’s details and risks of discomfort.

Facebook has claimed that a careful internal review was conducted for the emotional contagion experiment. But Cornell and PNAS are held to higher standards.

PNAS, for example, has its own guidelines for contributing authors. Requirements include:

  • Approval by the author’s institutional review board (the study was apparently approved)
  • For experiments involving human participants, a statement confirming that informed consent was obtained from all participants (the authors only alluded to Facebook’s Data Use Policy)
  • Conduct according to the principles expressed in the Declaration of Helsinki, which mandates that subjects be informed of risks (the authors have argued consent was implicit)

But some would counter these criticisms by claiming there is a difference between physically experimenting on individuals in a lab setting and studying how already-present emotional stimuli (news feed posts) minimally alter the moods of the masses; A/B testing of that sort, they argue, is typical of the tech business.
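For context, “A/B style” testing usually means deterministically bucketing users into experiment arms, as in this generic sketch (the experiment and arm names are made up):

```python
import hashlib

def assign_arm(user_id, experiment, arms=("control", "treatment")):
    """Hash user and experiment IDs into a stable, evenly spread bucket."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return arms[int(digest, 16) % len(arms)]

# The same user always lands in the same arm for a given experiment.
print(assign_arm("user_42", "feed_sentiment_test"))
```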

The takeaway

By now, most users have a basic understanding of how advertisements work and what they want from us. Facebook’s algorithm tweaks, or “improvements,” meanwhile, go unpublished and unnoticed, leaving little to complain about.

But when a study is designed to, say, make a person feel depressed rather than buy cat food, and we only learn about it after the fact, that naturally feels a bit more like a personal betrayal.

After all, one can opt out of a psychology study without losing anything. On the Internet, increasingly the only way to opt out of becoming a guinea pig is to stop using a product altogether, which, at least in this case, given users’ proven emotional ties to social media, seems unlikely to happen.

Here’s the study in full. Let us know your thoughts by tweeting @curiousmatic.

Jennifer Markert