Chatter: Learning from Facebook’s research misstep

Understanding the big issues at play in Facebook's experiment

Facebook’s been in a lot of bad headlines lately, and for once it’s got nothing to do with the NSA. The social network conducted an experiment on nearly 700,000 users, subtly modifying their newsfeeds over the course of one week in 2012 to show more positive or negative statuses, and logging whether their own statuses became more positive or negative. The result? A lot of angry Facebook users who felt they’d been manipulated without their consent.

(Also: an FTC complaint, a review by the UK Information Commissioner’s Office, and an apology from COO Sheryl Sandberg.)

For the digital advertising industry — which is in some ways a giant experiment with big data and consumer privacy — the controversy was too close for comfort. So here are a few takeaways to help marketers understand the issues at play and avoid Facebook’s mistake.

Duncan J. Watts, principal researcher at Microsoft, in The Guardian
“[W]e are being manipulated without our knowledge or consent all the time – by advertisers, marketers, politicians – and we all just accept that as a part of life. The only difference between the Facebook study and everyday life is that the researchers were trying to understand the effect of that manipulation.

“If that still sounds creepy, ask yourself this: Would you prefer a world in which we are having our emotions manipulated, but where the manipulators ignore the consequences of their own actions? What about if the manipulators know exactly what they’re doing … but don’t tell anyone about it? Is that really a world you want to live in?”

Alex Hern, tech reporter at The Guardian
“[F]or once, the discomfort with what Facebook is doing isn’t about its ongoing war with Google over who can know the most about our lives… I think what we’re feeling, as we contemplate being rats in Facebook’s lab, is the deep unease that what we think of as little more than a communications medium is actually something far more powerful. Facebook has so far successfully presented an image of neutrality … But with this study, Facebook has inadvertently highlighted, to a mostly uncaring community, just how actively it interferes. Up until now, it’s mostly shown off the times when it’s done so with beneficent aims: promoting organ donors, or voters. Now, that mask has dropped.”

Navneet Alang for The Globe and Mail
“[N]ow algorithms are in everything: not just Google searches, but in determining what shows up in your Facebook news feed, what recommendations you get on Netflix or Rdio – or, crucially, what kind of ads you might like to see. … In a sense, then, we are all involved in a grand experiment, as technology companies and entrepreneurs throw things at a wall to see what will stick. If Facebook experimented with the spread of sentiment, tech companies at large are experimenting with how humans socialize, gather information, relate to the physical world and more.”

Nilay Patel for Vox
“[N]o one’s mad about Taco Bell buying ads or promoted posts in Facebook’s News Feed, even though ads are designed to change our emotions. (Just wait until election season really heats up.) What we’re mad about is the idea of Facebook having so much power we don’t understand — a power that feels completely unchecked when it’s described as “manipulating our emotions.” Advertisers paying to change our feelings about products feels clean and familiar; Facebook screwing with your mind just to see what happens feels like playing god. But in the grand scheme of the internet economy, Facebook running a badly-designed test on a tiny fraction of its billion and a half users for one week in 2012 is virtually meaningless when the company is building an entire business around #brand #engagement with #audiences every other day of the year.”

Puneet Mehta, CEO of MobileROI, quoted in Advertising Age
“Most marketing research is opt-in, which is the main issue with this Facebook research, not the delivery mechanism (algorithm-based optimization). Such programs should not just be opt-out but truly opt-in only. Moreover, my results should be shared with me. If I don’t feel good about my results becoming a part of the overall pool, I should also be allowed to withdraw my data at any time.”
