Chatter: Learning from Facebook’s research misstep

Understanding the big issues at play in Facebook's experiment

Facebook’s been in a lot of bad headlines lately, and for once it’s got nothing to do with the NSA. The social network conducted an experiment on nearly 700,000 users, subtly modifying their news feeds over the course of one week in 2012 to show more positive or more negative statuses, and logging whether those users’ own statuses became more positive or negative in response. The result? A lot of angry Facebook users who felt they’d been manipulated without their consent.

(Also: an FTC complaint, a review by the UK Information Commissioner’s Office, and an apology from COO Sheryl Sandberg.)

For the digital advertising industry — which is in some ways a giant experiment with big data and consumer privacy — the controversy was too close for comfort. So here are a few takeaways to help marketers understand the issues at play and avoid Facebook’s mistake.

Duncan J. Watts, principal researcher at Microsoft, in The Guardian
“[W]e are being manipulated without our knowledge or consent all the time – by advertisers, marketers, politicians – and we all just accept that as a part of life. The only difference between the Facebook study and everyday life is that the researchers were trying to understand the effect of that manipulation.

“If that still sounds creepy, ask yourself this: Would you prefer a world in which we are having our emotions manipulated, but where the manipulators ignore the consequences of their own actions? What about if the manipulators know exactly what they’re doing … but don’t tell anyone about it? Is that really a world you want to live in?”

Alex Hern, tech reporter at The Guardian
“[F]or once, the discomfort with what Facebook is doing isn’t about its ongoing war with Google over who can know the most about our lives… I think what we’re feeling, as we contemplate being rats in Facebook’s lab, is the deep unease that what we think of as little more than a communications medium is actually something far more powerful. Facebook has so far successfully presented an image of neutrality … But with this study, Facebook has inadvertently highlighted, to a mostly uncaring community, just how actively it interferes. Up until now, it’s mostly shown off the times when it’s done so with beneficent aims: promoting organ donors, or voters. Now, that mask has dropped.”

Navneet Alang for The Globe and Mail
“[N]ow algorithms are in everything: not just Google searches, but in determining what shows up in your Facebook news feed, what recommendations you get on Netflix or Rdio – or, crucially, what kind of ads you might like to see. … In a sense, then, we are all involved in a grand experiment, as technology companies and entrepreneurs throw things at a wall to see what will stick. If Facebook experimented with the spread of sentiment, tech companies at large are experimenting with how humans socialize, gather information, relate to the physical world and more.”

Nilay Patel for Vox
“[N]o one’s mad about Taco Bell buying ads or promoted posts in Facebook’s News Feed, even though ads are designed to change our emotions. (Just wait until election season really heats up.) What we’re mad about is the idea of Facebook having so much power we don’t understand — a power that feels completely unchecked when it’s described as “manipulating our emotions.” Advertisers paying to change our feelings about products feels clean and familiar; Facebook screwing with your mind just to see what happens feels like playing god. But in the grand scheme of the internet economy, Facebook running a badly-designed test on a tiny fraction of its billion and a half users for one week in 2012 is virtually meaningless when the company is building an entire business around #brand #engagement with #audiences every other day of the year.”

Puneet Mehta, CEO of MobileROI, quoted in Advertising Age
“Most marketing research is opt-in, which is the main issue with this Facebook research, not the delivery mechanism (algorithm-based optimization). Such programs should not just be opt-out but truly opt-in only. Moreover, my results should be shared with me. If I don’t feel good about my results becoming a part of the overall pool, I should also be allowed to withdraw my data at any time.”
