In the most recent election for the Bundestag, Germany's federal parliament, TikTok accounts surfaced impersonating prominent political figures. In Colombia, posts on TikTok attributed a quote from a comic book villain to a candidate and allowed a woman to impersonate another candidate's daughter. In the Philippines, TikTok videos spread myths casting the country's former dictator in a falsely positive light, helping his son win the presidential race.
Similar problems have now reached the United States.
On the eve of the midterms, the legislative elections held in the middle of the presidential term and scheduled for November, TikTok is already becoming one of the biggest sources of misleading and false information, in many ways as problematic as Facebook and Twitter, say researchers who monitor fake news online.
That's because the same qualities that allow TikTok to fuel viral dance crazes (the platform's tremendous reach, the short length of its videos, its powerful but opaque recommendation algorithm) also make it difficult to challenge false claims.
Unsubstantiated conspiracy theories alleging that voter fraud will occur in November are widely viewed on TikTok, which has more than a billion active users worldwide every month. Users cannot search for the hashtag #StopTheSteal, but #StopTheSteallll racked up nearly 1 million views before TikTok disabled the hashtag after being contacted by The New York Times.
Some videos urged viewers to vote in November while citing rumors that were debunked during congressional hearings on the Capitol Hill attack of January last year. TikTok posts have received millions of views for claiming, without evidence, that predictions of a rise in Covid cases next fall are an attempt to discourage in-person voting.
The spread of misinformation has left TikTok grappling with many of the same thorny free speech and content moderation issues that Facebook and Twitter have confronted for years, with mixed results.
But the challenge is perhaps even more difficult for TikTok. Video and audio, which make up the bulk of the content shared on the app, can be much harder to moderate than text, especially when the tone is ironic. And many in Washington speculate that TikTok's business decisions on data and moderation may be influenced by its roots in Beijing, as the platform is owned by Chinese tech giant ByteDance.
“When you have very short videos with extremely limited text content, there is no space or time for nuanced discussions about politics,” says Kaylee Fagan, a researcher in the Technology and Social Transformation Project at the Shorenstein Center at the Harvard Kennedy School.
TikTok had barely arrived in the United States at the time of the 2018 midterms. During the 2020 presidential election, it was still widely seen as an entertainment app for teenagers.
Today, its American users spend an average of 82 minutes a day on the platform, three times more time than on Snapchat or Twitter and twice as much as on Instagram or Facebook, according to a recent study by app analytics company Sensor Tower. TikTok is gaining increasing importance as a destination for political content, often produced by influencers.
The company insists it is committed to fighting false information. According to a report it released last year, in the second half of 2020 it removed nearly 350,000 videos that included electoral disinformation and manipulated media. The company said the platform's filters prevented another 441,000 videos containing unproven claims from being recommended to users.
The service blocked so-called deepfakes and coordinated disinformation campaigns ahead of the 2020 election, made it easier for users to report false election information, and partnered with 13 fact-checking organizations, including PolitiFact. Researchers like Fagan say TikTok has gone to great lengths to eliminate problematic search terms. But it is still easy to get around its filters with creative spelling, as was the case with the hashtag #StopTheSteallll.
“We take our responsibility to protect the integrity of our platform and elections very seriously,” TikTok said in a statement. “We continue to invest in our policy and security teams to combat electoral disinformation.”
TikTok has also struggled to contain non-political disinformation in the United States. Myths about Covid vaccines and masks are rampant, as are rumors and fake news about diets, pediatric issues, and gender-affirming care for transgender people. A video falsely claiming that the massacre at Robb Elementary School in Uvalde, Texas, in May was staged attracted over 74,000 views before being taken down by TikTok.
TikTok posts about the Ukraine War have also been problematic. Even experienced journalists and researchers who analyze posts on the platform have a hard time distinguishing what is true from what is rumor or fabrication, according to a report by the Shorenstein Center released in March.
Researchers concluded that TikTok's design makes the app a breeding ground for misinformation. They note that videos can easily be manipulated and reposted on the platform and shown alongside stolen or original content; the use of pseudonyms is common; comic or parodic videos are easily interpreted as factual; popularity affects the visibility of comments; and the date and time a post was published, among other details, are not clearly shown in the mobile app.
(Researchers at the Shorenstein Center point out, however, that TikTok isn’t as vulnerable as platforms like Twitter or Facebook to so-called “brigading,” in which groups coordinate online to disseminate a post widely.)
According to TikTok, in the first quarter of 2022 more than 60% of videos containing harmful misinformation were viewed by users before being removed. Last year, a group of behavioral scientists who had worked with TikTok said an effort to pin warnings to posts with unproven content reduced sharing by 24% but limited views by only 5%.
Researchers say disinformation will continue to thrive on TikTok as long as the platform refuses to release data about the origins of its videos and share information about its algorithms. Last month, TikTok said it would offer some access this year to a version of its application programming interface, but did not say whether it would do so before the midterms.
Filippo Menczer, professor of informatics and computer science and director of the Observatory on Social Media at Indiana University, says he has proposed research collaborations to TikTok but has heard nothing back from the company.
“With Facebook and Twitter at least there’s some degree of transparency, but in the case of TikTok we don’t have a clue,” he says. “If we can’t access data, we don’t know who is suspended, what content is removed, whether they act in response to reports or what the criteria are for doing so. It’s completely opaque. We can’t assess anything independently.”
Faced with renewed fears that the company's ties to China could make it a threat to American national security, US lawmakers are also asking for more information about TikTok's operations. The company has said it intends to store data about US users separately from its Chinese parent company. It has also said its rules have changed since it was accused of censoring posts seen as contrary to Beijing's political goals.
The company declined to say how many human moderators it employs to work alongside its automated filters. (A TikTok executive told British politicians in 2020 that the company had 10,000 moderators around the world.) But former moderators have complained about difficult working conditions, saying there were too few of them for the workload and that they sometimes had to review videos in languages and with references they did not know, an echo of accusations made by moderators at platforms like Facebook.
Election season can be especially difficult for moderators, because political posts on TikTok tend to come from a diffuse assortment of users talking about broad issues rather than specific politicians or entities, says Graham Brookie, senior director of the Atlantic Council's Digital Forensic Research Lab.
"The fact is that all platforms can and need to do more to defend the shared facts on which democratic society depends," he says. "TikTok stands out for its size, its very rapid growth, and the number of unresolved issues with the way it makes its decisions."