TikTok Censored ‘Ugly, Poor or Disabled’ People to Attract More Users
As The Intercept reports, moderators at TikTok, which is owned by the Chinese company ByteDance, looked for users who had an “abnormal body shape,” “ugly facial looks,” an “obvious beer belly” and many other aesthetic qualities, in order to stop them from reaching the “For You” section of the app — the landing page users see when they open the app, which algorithmically promotes content. The internal documents state that “the only focus of the video [is] if the character’s appearance or the shooting environment is not good, the video will be much less attractive, not worthing [sic] to be recommended to new users.”
To ensure that TikTok users continually upload content meeting the company’s standards, the most popular users are contacted directly through video calls. According to a “moderation source” who spoke to The Intercept, this is to ensure that influencers and official content creators will not “creat[e] videos that go against what [ByteDance] think is right.”
ByteDance doesn’t want content that will harm “national honor.” Any content breaching this nebulous definition results in a permanent ban from the app, as does the “uglification or distortion of local or other countries’ history.” In the documents, the “Tiananmen Square incidents” are mentioned as one of only three real-world examples of content that would be removed. Users were also censored for “defamation … towards civil servants, political or religious leaders.”
The revelations in these documents are similar to those published previously by Netzpolitik, which reported on how the app was censoring LGBTQ users, as well as The Guardian, which discovered that TikTok was instructing censorship of “videos that mention Tiananmen Square, Tibetan independence or the banned religious group Falun Gong.”
In a statement to The Intercept, a TikTok spokesperson said that many of the practices described in the documents “are either no longer in use, or in some cases appear to never have been in place.” The rules suppressing unattractive, disabled and poor users “represented an early blunt attempt at preventing bullying, but are no longer in place” and, according to the company, had already been discontinued by the time the documents surfaced.