Facebook's icky psychology experiment was only worse by degree

Facebook’s recent psychology experiment is, minus the academic ambitions, a classic A/B test. This is the kind of test countless SaaS companies just like Wipster run constantly. We operate further down the slope from Facebook’s behaviour, but not on a different mountain. Is the business of selling always this sinister? Let’s look deeper...

To boil it down, in early 2012 researchers within Facebook collaborated with Cornell University and University of California academics on an experiment that manipulated the newsfeeds of around 700,000 users (by way of a perspective check – that’s about 0.04 percent of Facebook’s current userbase) to display only happy or sad posts.

Facebook’s hypothesis, and this is one we’ve all heard touted anecdotally by those who enjoy railing against Facebook, was that a user who is over-exposed to the positive experiences of their peers is ultimately prone to feelings of negativity.

They’ve taken an admirably thorough approach to user research, but all they’ve done is show the world exactly how to make 350,000 people feel sad.

It’s not gone down well. The resulting paper, rather ominously titled ‘Experimental evidence of massive-scale emotional contagion through social networks’, seems to have proved the opposite of the original theory, but at this stage that’s probably a moot point.

There even seems to have been a vague presumption of nobility to their pursuit, at least by the rationale of Facebook’s subsequent almost-but-not-quite apology. But the casual admission that ‘in hindsight, the research benefits of the paper may not have justified all of this anxiety’ doesn’t seem to have relieved anyone of the by-now familiar suspicion that the social media giant is simply messing with us because they can. We’ve probably all heard the caution that if you’re not paying for something, you’re probably the one being sold. Furthermore, the idea that they consider all of the babble that fills up most social streams to fall into only one of two categories is wonderfully hopeful and highly problematic. Where exactly do cat videos fall into this binary?

But the general hand-wringing and brow-furrowing shouldn’t be what’s of interest here. What it should give us pause to consider is how we all, to varying degrees, commit crimes of this nature. Unless you arrived at this page via a direct link, you most likely made your way through one or more A/B tests, where different content (sometimes an entirely different website) is served to different streams of visitors. If you’ve ever received an email from us it’s probably only one of several variations you might have received, depending on all kinds of criteria that we accumulate during people’s journey through our system. If you clicked on an ad to read this, then we’ve certainly manipulated you just to get you here.
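For the curious, the mechanics behind this kind of test are surprisingly simple. Here’s a minimal sketch of how a visitor might be bucketed into one variant or another; the function name and variant labels are our own invention for illustration, not a description of any particular vendor’s tool:

```python
import hashlib

def assign_variant(user_id: str, experiment: str, variants=("A", "B")) -> str:
    """Deterministically bucket a user into one variant of an experiment.

    Hashing the user id together with the experiment name means a
    returning visitor always lands in the same bucket, without the
    site having to store any per-user state.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]
```

Because the hash output is effectively uniform, roughly half of all visitors see each variant, and each visitor’s experience stays consistent across visits.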

Any web-based company should, and probably does, do this. It sounds cynical, but by now it’s completely normal. Admittedly, some take it to a ridiculous extreme, but there’s no better way to test assumptions that, as evidenced by the news-feed experiment, rarely prove to be exactly right.

The ugly truth of it is that it’s nearly impossible to sell anything without some degree of emotional manipulation. The world would be a rather more boring place if it were otherwise.

Even video, the space we work in, is not excluded from these kinds of experiments. Major films are extensively tinkered with based on viewer reactions, sometimes long after their initial release (that’s right - George Lucas is not alone on this). Online marketers have tools that allow them to serve up multiple variants of a video to different user segments, with detailed engagement analytics falling like candy out the other end. Wipster often fields requests for features that would allow our users to split-test the review and approval stages of video production in order to gauge reactions to different variants of the same work. Chances are we’ll build this before too long – because we believe in the power of A/B testing.
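Once two variants of a video have been shown to different segments, judging which one actually performed better comes down to comparing engagement rates. A standard way to do this is a two-proportion z-test; the sketch below is a generic illustration of that statistical method, not a feature of our product:

```python
from math import sqrt
from statistics import NormalDist

def compare_variants(engaged_a, viewers_a, engaged_b, viewers_b):
    """Two-proportion z-test: did variant B engage viewers differently
    from variant A?  Returns (rate difference, two-sided p-value)."""
    rate_a = engaged_a / viewers_a
    rate_b = engaged_b / viewers_b
    # Pool the two samples to estimate the shared engagement rate
    pooled = (engaged_a + engaged_b) / (viewers_a + viewers_b)
    se = sqrt(pooled * (1 - pooled) * (1 / viewers_a + 1 / viewers_b))
    z = (rate_b - rate_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return rate_b - rate_a, p_value
```

A small p-value suggests the difference in engagement is unlikely to be chance, which is the point at which a split-test result becomes worth acting on.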

Our admission of complicity isn’t meant to absolve Facebook of any guilt. They’ve smashed up all sorts of ethical boundaries and potentially even broken the law. But before this episode is filed away as just another chapter in the story of why everyone says they’re going to stop using Facebook but never does, we should take a moment to consider that this area is a blurry minefield of ethics that no one ever seems to have a proper handle on. A/B testing is a powerful tool. When used properly it creates vastly better user experiences. We should all do our best to make sure that happens.