Chatter: Learning from Facebook’s research misstep

Understanding the big issues at play in Facebook's experiment

Facebook’s been making a lot of bad headlines lately, and for once it’s got nothing to do with the NSA. The social network conducted an experiment on nearly 700,000 users, subtly modifying their news feeds over the course of one week in 2012 to show more positive or more negative statuses, and logging whether the users’ own statuses became more positive or negative in response. The result? A lot of angry Facebook users who felt they’d been manipulated without their consent.

(Also: an FTC complaint, a review by the UK Information Commissioner’s Office, and an apology from COO Sheryl Sandberg.)
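
If it helps to picture the mechanics, the study boils down to a randomized feed filter: each user was assigned to a condition, some posts were suppressed based on their sentiment, and the sentiment of what the user posted afterwards was measured. Here’s a minimal, hypothetical sketch in Python; the word lists, function names and drop rate are illustrative only (the published study scored posts with the LIWC word lists inside Facebook’s own systems).

```python
import random

# Illustrative word lists; the real study used the LIWC lexicon.
POSITIVE_WORDS = {"great", "happy", "love", "wonderful"}
NEGATIVE_WORDS = {"sad", "awful", "hate", "terrible"}

def sentiment(post):
    """Crude score: count of positive words minus negative words."""
    words = set(post.lower().split())
    return len(words & POSITIVE_WORDS) - len(words & NEGATIVE_WORDS)

def filtered_feed(posts, condition, drop_rate=0.5):
    """Suppress a fraction of positive or negative posts, depending on
    which experimental condition the user was randomly assigned to."""
    kept = []
    for post in posts:
        score = sentiment(post)
        suppress = (condition == "fewer_positive" and score > 0) or \
                   (condition == "fewer_negative" and score < 0)
        if suppress and random.random() < drop_rate:
            continue  # this post never reaches the user's feed
        kept.append(post)
    return kept

# One user's week: assign a condition, filter the feed, then compare
# the sentiment of what the user posts afterwards against controls.
condition = random.choice(["fewer_positive", "fewer_negative", "control"])
feed = filtered_feed(["What a wonderful day", "Traffic was awful"], condition)
```

Comparing the sentiment of what users wrote across conditions is what produced the study’s finding: small but statistically detectable shifts in the emotional tone of users’ own posts.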

For the digital advertising industry, which is in some ways a giant experiment with big data and consumer privacy, the controversy was too close for comfort. So here are a few perspectives to help marketers understand the issues at play and avoid repeating Facebook’s mistake.

Duncan J. Watts, principal researcher at Microsoft, in The Guardian
“[W]e are being manipulated without our knowledge or consent all the time – by advertisers, marketers, politicians – and we all just accept that as a part of life. The only difference between the Facebook study and everyday life is that the researchers were trying to understand the effect of that manipulation.

“If that still sounds creepy, ask yourself this: Would you prefer a world in which we are having our emotions manipulated, but where the manipulators ignore the consequences of their own actions? What about if the manipulators know exactly what they’re doing … but don’t tell anyone about it? Is that really a world you want to live in?”

Alex Hern, tech reporter at The Guardian
“[F]or once, the discomfort with what Facebook is doing isn’t about its ongoing war with Google over who can know the most about our lives… I think what we’re feeling, as we contemplate being rats in Facebook’s lab, is the deep unease that what we think of as little more than a communications medium is actually something far more powerful. Facebook has so far successfully presented an image of neutrality … But with this study, Facebook has inadvertently highlighted, to a mostly uncaring community, just how actively it interferes. Up until now, it’s mostly shown off the times when it’s done so with beneficent aims: promoting organ donors, or voters. Now, that mask has dropped.”

Navneet Alang for The Globe and Mail
“[N]ow algorithms are in everything: not just Google searches, but in determining what shows up in your Facebook news feed, what recommendations you get on Netflix or Rdio – or, crucially, what kind of ads you might like to see. … In a sense, then, we are all involved in a grand experiment, as technology companies and entrepreneurs throw things at a wall to see what will stick. If Facebook experimented with the spread of sentiment, tech companies at large are experimenting with how humans socialize, gather information, relate to the physical world and more.”

Nilay Patel for Vox
“[N]o one’s mad about Taco Bell buying ads or promoted posts in Facebook’s News Feed, even though ads are designed to change our emotions. (Just wait until election season really heats up.) What we’re mad about is the idea of Facebook having so much power we don’t understand — a power that feels completely unchecked when it’s described as ‘manipulating our emotions.’ Advertisers paying to change our feelings about products feels clean and familiar; Facebook screwing with your mind just to see what happens feels like playing god. But in the grand scheme of the internet economy, Facebook running a badly designed test on a tiny fraction of its billion and a half users for one week in 2012 is virtually meaningless when the company is building an entire business around #brand #engagement with #audiences every other day of the year.”

Puneet Mehta, CEO of MobileROI, quoted in Advertising Age
“Most marketing research is opt-in, which is the main issue with this Facebook research, not the delivery mechanism (algorithm-based optimization). Such programs should not just be opt-out but truly opt-in only. Moreover, my results should be shared with me. If I don’t feel good about my results becoming a part of the overall pool, I should also be allowed to withdraw my data at any time.”

Advantage Articles

Canadian marketers find data analytics valuable, but most aren’t using it

A new survey from Marketing and AcuityAds finds that the majority of Canadian marketers say analytics are important, but experts say there are many barriers to adoption

Is GirlsinYogaPants.com truly safe for your brand?

SPONSORED: Ray Philipose reviews the subjective nature of brand safety

The programmatic mindset is not about numbers

Chango's Dax Hamman says programmatic is actually about people, not algorithms

McDonald’s Hope Bagozzi joins IAB Canada board

The marketer joins the board alongside the Globe’s Andrew Saunders and Microsoft’s Brandon Grosvenor

Watch this bot generate 10,000 fraudulent ad impressions

Forensiq shows a day in the life of a revenue-stealing ad bot

Juice Mobile takes a fresh perspective on automated ad sales

Juice's signature Nectar platform isn't really programmatic – and it isn't really direct either