TikTok has also developed a moderation system API that it plans to make available this fall through its virtual Transparency and Accountability Center hub. The API will give select researchers a way to evaluate TikTok’s content moderation systems and examine existing content on the app. In addition, researchers will be able to upload their own content to see whether it is permitted, rejected or passed to moderators for further evaluation.
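TikTok has not published the moderation API’s actual interface, but the three-way outcome described above — content is permitted, rejected, or passed to human moderators — can be sketched as a simple decision flow. Everything in the sketch below (the function name, the stand-in keyword list, and the confidence threshold) is a hypothetical illustration, not TikTok’s real implementation.

```python
# Hypothetical sketch of a three-way moderation triage: permit, reject,
# or escalate to a human moderator. The keyword list and thresholds are
# illustrative stand-ins, not TikTok's confidential lists or logic.

FLAGGED_KEYWORDS = {"spam-link", "scam-offer"}  # stand-in for a confidential keyword list

def triage(text: str, model_confidence: float) -> str:
    """Return 'permit', 'reject', or 'escalate' for a piece of content.

    model_confidence is a hypothetical classifier score in [0, 1]
    estimating how likely the content violates policy.
    """
    has_flagged_term = any(kw in text.lower() for kw in FLAGGED_KEYWORDS)
    if has_flagged_term and model_confidence >= 0.9:
        return "reject"    # high-confidence violation: removed automatically
    if has_flagged_term or model_confidence >= 0.5:
        return "escalate"  # uncertain: passed to human moderators for review
    return "permit"        # no violation signal: content stays up

print(triage("check out this scam-offer now", 0.95))  # reject
print(triage("a borderline caption", 0.6))            # escalate
print(triage("a normal video caption", 0.1))          # permit
```

In practice, uploading test content through the API would presumably exercise a pipeline like this end to end, letting researchers observe which of the three outcomes a given piece of content receives.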
TikTok will also give the independent experts on its U.S. Content Advisory Council access to the API, along with confidential information such as the keyword lists it uses to help detect and flag potentially violative content. TikTok says this access will allow the experts to conduct deeper analyses. Lastly, TikTok will publish insights about the covert influence operations it identifies and removes from its platform globally in its quarterly Community Guidelines Enforcement Reports.
“These initiatives are well underway and will be launching over the coming months this year,” said Vanessa Pappas, the chief operating officer at TikTok, in a blog post. “We’ll update on our progress as we continue to innovate when it comes to being transparent and accountable.”