With less than a week until Election Day, Facebook has admitted to a glitch in the system that handles political ads on its platform. “Technical flaws” related to a new transparency effort, which restricted new political ads from appearing on Facebook in the week before the election, caused an unspecified number of previously approved political ads to not appear at all. The Biden and Trump campaigns both say some of their ads were among them.
This looks bad for Facebook. While the company says it has mostly fixed the problem — and that the issue had nothing to do with partisanship — the situation highlights a growing distrust in Facebook’s ability to manage political content on its platform. Facebook says the moratorium on new political ads that led to the glitch was part of its “efforts to ensure maximum transparency.” The Biden campaign says Facebook let them down.
“We have no sense of the scale of the problem, who it is affecting, and their plan to resolve it,” Biden’s digital director Rob Flaherty said in a statement Thursday night. “It is abundantly clear that Facebook was wholly unprepared to handle this election despite having four years to prepare.”
This is just the latest in a series of episodes that raise questions about Facebook’s commitment to transparency in its handling of political ads, including objectionable content found in those ads and the opaque way they are targeted. It’s also not clear to many users how such ads target them. Facebook has previously taken steps to block ad-tracking tools, including one built by New York University researchers, one from ProPublica, and another from Mozilla. So some worry that Facebook’s stated commitment to ad transparency is an empty promise and that the platform has failed to moderate itself successfully.
“Every week, there is new bad stuff that gets through Facebook’s own monitoring and screening,” Laura Edelson, a researcher at NYU studying political ads, told Recode. “The real danger is that Facebook says it can do this job themselves, but they can’t.”
Last month, Edelson and her colleagues at NYU launched a project called the Ad Observatory that, in part, allows users to download a browser extension designed to record information about the political ads they see on the platform. The browser extension, which is called Ad Observer, “allows journalists and researchers to better understand the political misinformation and manipulation that spreads daily on your platform,” the group said.
But on October 16, Facebook sent the NYU project a cease-and-desist letter, demanding that the researchers shut down the tool before the end of November. In response, a slew of organizations led by Mozilla called on Facebook to withdraw its letter and work with the researchers on improving political ad transparency.
Facebook claims that it provides transparency with its Ad Library, which the company built in response to demand for information about promoted campaigns on its platform. This searchable database shows information about active and inactive ad campaigns being run on Facebook, including the amount spent as well as the ages, genders, and locations of people who end up seeing an ad. However, the conflict between Facebook and researchers at the Ad Observatory project suggests that users don’t know much about why they see certain political ads. Technical problems with the Ad Library are also the reason why an unspecified number of previously approved political ads did not run the week before the election.
As for the Ad Observer tool itself, Facebook says the browser extension engages in bulk data collection, which is a violation of the company’s terms of service. The cease-and-desist letter also said that if the researchers don’t shut down the tool voluntarily, they “may be subject to additional enforcement action.” In fact, the company says it told the researchers months earlier that such a tool would go against its rules. It also demanded that all the data collected by the project be deleted.
“We informed NYU months ago that moving forward with a project to scrape people’s Facebook information would violate our terms,” Facebook spokesperson Joe Osborne said in a statement to Recode. “Our Ad Library, which is accessed by more than 2 million people every month, including NYU, already provides more transparency into political and issue advertising than TV, radio or any other digital ad platform.”
But while the Ad Library does reveal some general details about impressions — like where ads ended up being shown and the gender breakdown of those who saw an ad — researchers say that’s not enough. Many argue that the tool lacks pivotal details about how those ads were actually targeted, and some claim that not all political ads make it into Facebook’s library tool.
The recent back-and-forth with Facebook has actually led to a surge in participation in the NYU project. Since news of the cease-and-desist letter broke, thousands more volunteers have downloaded the Ad Observer browser extension; the project now counts more than 13,000 participants, double the roughly 6,500 who had signed up before the letter.
Meanwhile, the NYU researchers say the tool does not collect personally identifiable information and that the data of all the participants is anonymized and combined. “No personal information from volunteers is collected,” says the Ad Observatory’s website, which specifies that its tool does not collect “anything personally identifying,” including names, birthdays, friend lists, or ad interactions.
Some employees at Facebook have suggested that the tool is not safe. Facebook did not share whether it had any plans to reveal any new information about targeting.
Sen. Amy Klobuchar (D-MN) asked Facebook CEO Mark Zuckerberg during a recent Senate hearing on Section 230 about political advertisements on the platform. This is a topic Klobuchar has been following as a co-sponsor of the Honest Ads Act, which would require tech companies to reveal more information about how political ad targeting works. While the law hasn’t passed, she’s asked Zuckerberg to meet its standards for fully disclosing which groups are targeted by particular political ads.
In a statement to Recode, Klobuchar said that technology companies, including Facebook, have not met those standards, and she condemned recent reports about the company trying to quash research.
“As we face threats to our democracy, we need more transparency, not less,” Klobuchar said.
Other social platforms have recently become more cautious about political ads: major platforms like Twitter, Nextdoor, and TikTok have banned them entirely. Facebook, by contrast, has doubled down on its controversial policy of refusing to fact-check political ads. There were some related political ad controversies earlier this year, including the company allowing the Trump campaign to run hundreds of misleading ads related to the census as well as ads that contained Nazi imagery. Facebook has since added the option for users to turn off political ads.
In an email, NYU’s Edelson told Recode that she fully supported Facebook’s recent political ad policy changes, like banning new political ads in the week before Election Day.
“However, it’s now clear that not only has the communication about these policies been haphazard and confusing, but the implementation has been as well,” Edelson said. “If Facebook wants to rebuild trust with both users and advertisers, they need to be much more transparent about how political advertising works on their systems.”
The most recent hiccups in Facebook’s political ad system show that some — from academics to presidential campaigns — remain concerned about the company’s transparency efforts. The very changes Facebook has made have, in some cases, created more problems and unintended confusion. So it’s worth wondering, days before a pivotal election that the company has known about for years, why Facebook still doesn’t seem prepared.
Update October 30, 1:10 ET: This piece has been updated with an additional comment from Edelson.
Open Sourced is made possible by Omidyar Network. All Open Sourced content is editorially independent and produced by our journalists.