TikTok is making tweaks to how it recommends videos on the app in an attempt to make sure it doesn’t inadvertently reinforce negative experiences.
On Thursday, the popular social video app said it’s testing changes to avoid showing people too much of one type of content that might be fine as a single video but could be “problematic if viewed in clusters,” such as content dealing with extreme dieting or fitness, sadness, or breakups.
“We’re also working to recognize if our system may inadvertently be recommending only very limited types of content that, though not violative of our policies, could have a negative effect if that’s the majority of what someone watches, such as content about loneliness or weight loss,” the company wrote in a blog post.
For years, social media platforms like TikTok and Instagram have been criticized for harboring harmful content and fostering anxiety and depression, particularly among teens and younger audiences. In October, TikTok alongside YouTube and Snap faced scrutiny from lawmakers during a Senate hearing focused on keeping kids safe online.
TikTok on Thursday said it’s also working on a feature to let people block videos associated with words or hashtags they don’t want recommended in their For You feed. In September, TikTok released mental health guides and resources designed to support people with negative body image or eating disorders.