The 2020 race and the spread of Covid-19 are testing the boundaries for political ads. Here's how big tech companies decide what can, and cannot, appear on their platforms.
By Patience Haggin and Emily Glazer
Published June 4, 2020 at 11:00 a.m. ET
Political campaigns and advocacy groups will funnel billions of dollars into digital advertising during the 2020 election season. With virtually no federal guidelines regulating these ads, the major online platforms are drawing up their own rules for what political advertisers can and cannot do.
Platforms’ decisions to moderate content have sparked widespread debate, as with Twitter Inc.’s decision in May to flag tweets posted by President Trump that violated its rules and Facebook Inc.’s decision to allow similar posts to appear without moderation.
Facebook, Alphabet Inc.-owned Google and Twitter updated their ad policies in 2019, setting rules for what counts as a political ad, who is allowed to buy them and what they can say. All three platforms use a combination of automated and human review to enforce their policies. The following uses real ads as examples to explain how the three platforms diverge.
Google and Facebook declined to comment on ads that didn’t run on their own platforms. Twitter confirmed our reporting about how the site would handle certain ads.
What counts as a political ad?
Google allows ads:
about or bought by current officeholders or candidates for an elected federal or state office
about or bought by a federal or state political party
about a state ballot measure
Facebook allows ads:
about, bought by or on behalf of a current or former candidate for public office, a political figure or a political party
about any election, referendum or ballot initiative
about “social issues”
Twitter allows “cause-based ads” but bans ads or promoted tweets:
about a candidate, political party, elected or appointed government official, election, referendum, ballot measure, legislation, regulation, directive or judicial outcome
made by a candidate, political party, elected or appointed government official, PAC, super PAC or 501(c)(4) organization
What about unsubstantiated claims?
Several Facebook groups bought ads spreading “Plandemic,” a documentary filled with unsubstantiated claims about the Covid-19 pandemic, including conspiracy allegations against the leader of the government’s pandemic response. Fact-checking groups PolitiFact and FactCheck.org labeled these claims as false. All three platforms have removed the “Plandemic” video from posts.
Facebook has removed ads promoting the documentary purchased by groups such as the Published Reporter, Citizen Media, Last Resort Podcast and the Chattanooga Tea Party.
The Chattanooga Tea Party’s president said Facebook tried to have it both ways: it accepted the group’s advertising dollars but didn’t take down the video until after the campaign had run its course. A Facebook spokesman declined to comment on that allegation. The Published Reporter confirmed it promoted the “Plandemic” video and called it “unfortunate” that Facebook was silencing its opinion. The Last Resort Podcast promoted a commentary video discussing “Plandemic,” which was also temporarily removed from YouTube, and expressed similar sentiments about tech platforms’ ability to censor speech. Citizen Media didn’t respond to requests for comment.
Dr. Judy Mikovits, who appears in “Plandemic,” defended the film as accurate. The film’s production company didn’t respond to requests for comment.
Google wouldn’t allow this ad. Its policy doesn’t allow ads with misleading claims.
Facebook removed this ad. Facebook’s policy doesn’t allow content that contributes to the risk of imminent violence or physical harm. A Facebook spokesman explained that the video’s claim that wearing a mask could make you sick might contribute to physical harm. Facebook also classified this as a political ad.
Twitter wouldn’t allow this under its policy banning misrepresentative content.
The Trump campaign bought ads on Google-owned YouTube and on Facebook claiming that former Vice President Joe Biden promised Ukraine $1 billion in U.S. aid to fire a prosecutor looking into a gas company with ties to his son. Fact-checking groups PolitiFact and FactCheck.org labeled these claims as false. A Trump campaign spokesman defended the ad’s claims.
Google allowed this ad. A Google spokeswoman reiterated that the ad complies with Google’s policies.
Facebook allowed a nearly identical ad. Facebook’s policy is not to fact-check ads or content by political candidates.
Twitter wouldn’t allow this ad because it is about an election.
Sen. Elizabeth Warren’s presidential campaign bought a Facebook ad claiming that Facebook CEO Mark Zuckerberg endorsed President Trump’s re-election. The ad acknowledges that this claim is false.