
Ahead Of Midterms, Researchers Say Facebook Can’t Reliably Identify Political Ads

With the 2022 midterm elections fast approaching, and concerns about election misinformation growing, Facebook's parent company Meta is again touting its Ad Library as an “industry leading” transparency tool that will mitigate those concerns. The Ad Library, which allows members of the public to see who placed certain ads and how much money they spent on them, has been used extensively by researchers and has even been cited as an example for other companies to follow.

But a paper presented by researchers at NYU and KU Leuven last week at the USENIX Security Symposium suggests that the Ad Library may be deeply flawed.

The researchers found that most of the time, Facebook incorrectly categorized political ads that advertisers did not label themselves. “Facebook misses more ads than they detect, and over half of those detected ads are incorrectly flagged,” the researchers wrote. Meta disputes this characterization because the research did not examine ads that Facebook blocked before they ever ran on the platform. Facebook does not make those ads available to researchers or the public.

November 2022 will be the first federal election since an armed mob — fomented in part on Facebook — stormed the Capitol in an attempt to overthrow the last presidential election. But even after the insurrection, Meta faced criticism for failing to enforce its advertising policies about political violence. And while other platforms, including Twitter and TikTok, have opted to ban political ads, Meta continues to sell them, prohibiting their purchase for just a week before the actual election. (Disclosure: In a former life, I held policy positions at Facebook and Spotify.)

The paper, which was first released in late 2021, identifies both significant overenforcement and underenforcement of Facebook’s political advertising rules. In a sample of 189,000 ads that ran between July 2020 and February 2021 and were deemed political under the company’s policies, the researchers determined that Facebook made an incorrect determination 83% of the time, either by flagging non-political ads as political (“false positives”) or by failing to flag genuinely political ads that ran without a disclaimer.

Andy Stone, a spokesperson for Meta, said in a statement: “The study itself acknowledges that ‘Facebook’s initial ad review already catches most violations,’ that half the ads our review initially missed were caught ‘rather quickly,’ and the ones that weren’t caught represent less than three percent of the 4.2 million political ads the study’s authors claim ran on the platform.” According to the researchers, 4.1 million of those 4.2 million political ads were designated as political by advertisers themselves.

Meta has had a rocky relationship with the NYU research team because the team uses a scraper to collect data from the Ad Library, and Meta forbids scraping. Weeks before the 2020 election, Facebook threatened legal action against the researchers, arguing their work infringed on user privacy. The platform subsequently banned them. In the 2022 paper, the NYU team clarifies that its data collection “does not affect any non-advertiser Facebook users” and “do[es] not observe any personally identifiable information on them.” Undeterred by the ban, the researchers launched a new monitoring hub for the 2022 election cycle earlier this month.

They also issued a set of recommendations about how Meta could more accurately detect political ads. Damon McCoy, one of the researchers, told Forbes that one of the simplest of those recommendations was that Meta should focus more on the person who placed the ad, and their affiliations, as opposed to scanning the content of each individual ad. Facebook requires that all ads placed by candidates, political parties and PACs be marked as political. But McCoy referenced several misses on the platform, including ads by Joe Biden and the National Rifle Association.

For some countries, McCoy said, “detection could be greatly improved with just a little bit of work collecting lists of political candidates and political groups.” Meta’s Stone said this work was ongoing, especially in the run-up to elections in individual countries.

McCoy also told Forbes that in the lead-up to the 2020 election, his research team repeatedly saw US political advertisements from the Chinese state media newswire Xinhua News running without a political ad disclaimer. They flagged the content to Facebook, but according to McCoy, a Facebook employee told them that the company had spoken with Xinhua News and been assured that the issue would not occur again. After multiple such alerts from the researchers, Facebook finally prohibited all US political ads by foreign state media entities in June 2020.

Facebook has also made one other change to its policies since the 2020 election cycle: It now “might” allow “commercial advertisers” to refrain from labeling ads about social issues, if the core ad is about a service or product. (The exception does not cover ads that explicitly mention politics or elections.) This exception could reduce the prevalence of overenforcement against companies, which the researchers found was quite common. But it could also lead to meaningful underenforcement. Now, advertisers can theoretically circumvent transparency requirements by turning an ad that says “abortion is murder,” which would require a disclosure, into an ad for a t-shirt that says “abortion is murder,” which would not necessarily require one.

Stone said it would not be accurate to assume an “abortion is murder” t-shirt would be exempt from authorization and disclosure requirements, and noted the company’s policy regarding merchandise ads is to assess the primary purpose of the ad.

The problem, though, is more than theoretical: within minutes of perusing the Ad Library, Forbes found four current ads, running without disclaimers, that appeared to have a primary purpose of social issues advocacy.

One, an ad for a t-shirt that says “Put The FBI On The FBI Watch List,” was promoted by the conservative provocateur Dinesh D’Souza, who is most recently known for receiving a pardon from President Donald Trump and making a documentary challenging the legitimacy of the 2020 presidential election. The text of the ad said: “When freedom is being challenged in all conceivable ways, true PATRIOTS will not sit down and just let such come to pass.”

Two others related to abortion: one said “It’s Not About Abortion. It’s About Control.” and was followed, in much smaller text, by a link to buy pro-choice merch. Another, announcing “YOU CAN SAVE BABIES!”, encouraged people to sponsor ultrasounds for other pregnant people in hopes of dissuading them from choosing abortion.

After reviewing the images, Stone said, “They should have had disclaimers and we’ve enforced against them.”

Update: This story has been updated to clarify the timeline of events between Facebook and the NYU research team.
