TikTok’s damaging impact on teenagers won’t be fixed by content filters alone

TikTok announced last week that it would introduce content filters to prevent minors from viewing material with “overtly mature themes.”

“When we detect that a video contains mature or complex themes, for example, fictional scenes that may be too frightening or intense for younger audiences, a maturity score will be allocated to the video to help prevent those under 18 from viewing it across the TikTok experience,” Cormac Keenan, TikTok’s head of trust and safety, said in a blog post.

Along with introducing “maturity scores,” which the company says will function similarly to TV, film, and gaming industry ratings, TikTok will soon allow all users to block videos from their “For You” and “Following” feeds by word or hashtag.

TikTok has become notorious as a platform for dangerous trends, such as the blackout challenge, which encourages users to choke themselves until they pass out. Multiple children have died attempting the challenge after it appeared on their “For You” pages, leading parents to sue the company.

The app also exacerbates depression and anxiety by making mental health issues trendy, with teenage girls using it to self-diagnose conditions such as dissociative identity disorder, attention-deficit/hyperactivity disorder, or obsessive-compulsive disorder after watching videos suggested to them on these topics. Other girls have developed tics after watching content featuring people with Tourette syndrome.

Content filters will go a long way toward mitigating the impact of dangerous and sexually explicit material, especially since teenagers need not seek out such content to be exposed to it. One in five minors between the ages of 12 and 16.5 has been accidentally exposed to pornography. Other studies put the number higher, with half of teenagers in a Barna study saying they “come across” porn at least once a month.

Yet the dangers adolescents face on social media often extend beyond what content filters can address. While companies bear part of the blame, they cannot be held responsible for all of it. The greater responsibility rests with parents who allow their children to use the apps.

Even if every “overtly mature” video were wiped from TikTok and all other platforms, problems would persist. Social media is intrinsically addictive and prone to producing unhealthy behavior. When adults struggle with self-image, comparison, and wasting time online, children who are still developing will be even more susceptible.

Parents should think hard about what kind of restrictions they will place on social media use and whether their children really need access to these platforms at all.