New TikTok guidelines out
Popular video-sharing app TikTok issued a broad ban on “misleading information” that could cause harm to its community or the public, setting itself apart from rivals such as Facebook, which says it does not want to be an arbiter of truth.
“We remove misinformation that could cause harm to an individual’s health or wider public safety. We also remove content distributed by disinformation campaigns,” TikTok, owned by Chinese tech company ByteDance, said in new guidelines which expand and add detail to its earlier rules.
According to data from research firm Sensor Tower, TikTok and its Chinese counterpart, Douyin, have been downloaded more than 1.5 billion times, including 680 million downloads in 2019.
TikTok’s previous rules on “misleading content” appeared to focus mostly on scams, barring users from creating fake identities or posting false information to make money, but did not mention misinformation or disinformation campaigns.
By contrast, the new rules explicitly ban “misinformation meant to incite fear, hate, or prejudice,” “misleading information about medical treatments,” and “content that misleads community members about elections or other civic processes.”
The guidelines did not explain how TikTok would determine what constitutes “misleading” content.
The announcement follows a similar move by a rival: on Monday, Facebook announced a new policy banning deepfakes and other manipulated media.