TikTok removed almost 350,000 videos for spreading election misinformation

Karissa Bell
Senior Editor

TikTok is offering a new glimpse into just how much misinformation is on its platform. Between July and December of last year, the app removed hundreds of thousands of videos for breaking its rules around misinformation about the 2020 presidential election and the coronavirus pandemic. Details of the takedowns were released as part of the company’s latest transparency report.

Unsurprisingly, election misinformation was the most prevalent. The company removed 347,225 videos for sharing election misinformation or manipulated media, according to the report. An additional 441,000 clips were removed from the app’s recommendations because the content was “unsubstantiated.” (Like Facebook, TikTok works with third-party fact-checking organizations; the company also warns users when videos contain “unverified” claims.)

During the same period, TikTok took down 51,505 videos for sharing misinformation about COVID-19. In its report, TikTok notes that 87 percent of these clips were removed within 24 hours of being posted, and that 71 percent had “zero views” at the time they were removed.

The new stats come after TikTok tightened its policies around misinformation ahead of the election. In the lead-up to the 2020 election, the company introduced new rules barring deepfakes and expanded its work with fact-checking organizations to debunk false claims. The app also added in-app notices to direct users to credible information. TikTok says its PSAs were viewed more than 73 billion times.

In its report, TikTok says it was well-prepared for the election, and that much of the misinformation came from domestic sources within the United States. “We prepared for 65 different scenarios, such as premature declarations of victory or disputed results, which helped us respond to emerging content appropriately and in a timely manner,” TikTok writes. “We also prepared for more domestic activity based on trends we’ve observed on how misleading content is created and spread online. Indeed, during the US 2020 elections, we found that a significant portion of misinformation was driven by domestic users: real people.”

The company also notes that misinformation and disinformation represent only a fraction of the total content TikTok removes. The app took down more than 89 million videos that broke its rules, according to the report. As with its previous report, the biggest categories of takedowns were “minor safety” (36 percent of removals) and adult nudity (20.5 percent). “Integrity and authenticity,” which includes misinformation as well as things like bots and fake accounts, represented 2.4 percent of TikTok’s takedowns.

But even though misinformation makes up a relatively small share of removals, TikTok has at times struggled to contain it once it goes viral. In the days after the election, videos spreading debunked conspiracy theories about voter fraud racked up hundreds of thousands of views before they were removed, according to Media Matters.