Innovation Monitor: Facebook and the research ecosystem
Welcome to this week’s Innovation Monitor.
Launched last year, Ad Observer anonymously collects ad data from users’ Facebook accounts in order to better understand the spread of disinformation; the team presented the project at NYC Media Lab’s 2020 Summit. Earlier this month, Facebook suspended the accounts of Laura Edelson and Damon McCoy, the NYU researchers behind the Ad Observer extension. To many, this attempt to suppress open research reflects the social media platform’s sense of impunity.
This week, we dive into what happened, what Facebook’s internal rationale was, and whether this is just history repeating itself. These questions matter as we work to better understand the widespread problem of disinformation on digital platforms. What counts as good faith, what counts as self-regulation, and what is simply a company protecting itself are all taken up below.
As summer winds down, the Innovation Monitor will go on a brief vacation for the next two weeks, but fret not, we’ll be back in September! Thank you for reading, and as always, if you were forwarded this email, you can easily sign up here!
Erica Matsumoto

Privacywashing

While Facebook’s decision reinforces the adage that it’s not in the interest of for-profit organizations to hold themselves accountable, it still feels like an affront, mainly due to the company’s “privacywashing,” as EFF’s Rory Mir and Cory Doctorow call it in their response piece. In a statement, Facebook product management director Mike Clark justified the suspensions by citing a violation of the company’s Terms:
“Collecting data via scraping is an industry-wide problem that jeopardizes people’s privacy, and we’ve been clear about our public position on this as recently as April. The researchers knowingly violated our Terms against scraping — which we went to great lengths to explain to them over the past year. Today’s action doesn’t change our commitment to providing more transparency around ads on Facebook or our ongoing collaborations with academia.”
In response, Mir and Doctorow wrote:
“Secrecy is not privacy. A secret is something no one else knows. Privacy is when you get to decide who knows information about you. Since Ad Observer users made an informed choice to share the information about the ads Facebook showed them, the project is perfectly compatible with privacy. In fact, the project exemplifies how to do selective data sharing for public interest reasons in a way that respects user consent.”
Similar privacywashing tactics have been used to quash external projects that didn’t collect any user data: “Last year EFF weighed in on another case where Facebook abused the CFAA to demand that the ‘Friendly browser’ cease operation. Friendly allows users to control the appearance of Facebook while they use it, and doesn’t collect any user data or make use of Facebook’s API. Nevertheless, the company sent dire legal threats to its developers.”

Involving the FTC

In his post, Clark notes that for months, Facebook “attempted to work with New York University to provide three of their researchers the precise access they’ve asked for in a privacy-protected way. We took these actions to stop unauthorized scraping and protect people’s privacy in line with our privacy program under the FTC Order.”
Clark is referring to the FTC’s 2019 consent decree against Facebook, which stemmed from the Cambridge Analytica scandal and carried a staggering $5 billion fine. The caution seems to make sense… except it doesn’t.
The FTC’s acting director of the Bureau of Consumer Protection, Sam Levine, published an open letter to Mark Zuckerberg on August 5, expressing his disappointment at Facebook’s use of the decree as justification for terminating academic research:
“Had you honored your commitment to contact us in advance, we would have pointed out that the consent decree does not bar Facebook from creating exceptions for good-faith research in the public interest. Indeed, the FTC supports efforts to shed light on opaque business practices, especially around surveillance-based advertising.”

Protecting the Advertisers

So if Facebook didn’t suspend Edelson and McCoy’s accounts for violating the decree, what was its internal reasoning? According to Wired politics writer Gilad Edelman, it was to shield the company’s customers, not its users:
“As Issie Lapowsky reported for Protocol in March, Facebook’s biggest concern may be the advertisers themselves. The company seems to believe that a person or business that pays to target users with ads on Facebook is entitled to a degree of secrecy about it. After all, Facebook could make this whole controversy go away by simply making the data on how ads are targeted public, which would eliminate the need for workarounds like the Ad Observer.”
Joe Osborne, a Facebook spokesperson, acknowledged the decree wasn’t the main reason. According to Wired: “Osborne says, the researchers repeatedly violated a section of Facebook’s terms of service that provides, ‘You may not access or collect data from our Products using automated means (without our prior permission).’” But this is a stretch, according to Edelson:
“Scraping is when I write a program to automatically scroll through a website and have the computer drive how the browser works and what’s downloaded. That’s just not how our extension works. Our extension rides along with the user, and we only collect data for ads that are shown to the user.”

A Familiar Pattern

This is a familiar pattern: Facebook apologizes for harmful behavior on its platform, makes surface-level adjustments, and moves on. In a report this month, Facebook said it had removed hundreds of accounts tied to anti-vaccination campaigns from its platform and Instagram.
But as author Cecelia Kang told TechCrunch in an interview, the vaccine misinformation has been metastasizing for a decade: “It’s got deep roots across all parts of Facebook. And it’s homegrown. It’s Americans who are spreading this misinformation to other Americans. So it challenges all Facebook’s tenets on free speech and what it means to be a platform that welcomes free speech but also hasn’t drawn a clear line between what free speech is and what harmful speech is.”
A leaked document published by BuzzFeed in April showed that Facebook knew they “failed to take appropriate action to limit the organizing capabilities of ‘Stop the Steal’ groups, and should ‘do better next time.’” But Facebook never intended to make the document public, of course, nor does it intend to make it easy for researchers to “reconstruct the crime scene,” as The Atlantic wrote in June:
“Joan Donovan, the director of the Technology and Social Change Research Project at Harvard Kennedy School…. noted that inviting outside researchers to figure out what happened at Facebook means that universities and nonprofits will have to spend millions of dollars to do what Facebook could easily do itself…. Because so much of the data involved have been deleted, many researchers will also be stuck trying ‘to reconstruct a crime scene in which Facebook has most of the evidence and is unwilling to share.’”
The NYU researchers’ scenario isn’t unique. Researchers at AlgorithmWatch, a nonprofit that studies how automated decision-making systems impact society and which also runs a browser add-on that monitors volunteers’ activity, were “forced to abandon their research project monitoring the Instagram algorithm after legal threats from Facebook,” according to The Verge.

This Week in Innovation History
August 19th, 2004: Google holds its Initial Public Offering
On this day, the search giant issued 22 million shares to the public at a price of $85. The IPO was a memorable one, not only because the company would go on to become one of the most dominant technology firms in the world, but also because it used an unusual “Dutch auction” system to allocate and price shares. It was also the first IPO whose S-1 filing (the registration form disclosing all financials and business details) contained a “Founder’s Letter” defining the firm’s vision. The letter began: “Google is not a conventional company. We do not intend to become one.”
NYC Media Lab · 370 Jay Street, 3rd floor · Brooklyn, New York 11201 · USA