Teenagers spent an average of two hours a day on TikTok in 2023, data reveal

Data show that teenage TikTok users spent an average of two hours a day on the app in 2023, a sign that the social media network has only grown more prevalent among young people even as it faces mounting skepticism from the public and Congress.

A report Wednesday from parental control software developer Qustodio found that users aged 18 and under in the United States watched TikTok for 120 minutes a day in 2023, up from 113 minutes in 2022.

Instagram saw similar growth. Young users in the United States spent 65 minutes a day on the app, a significant increase from 43 minutes in 2022.

The increase in time spent on these platforms coincided with growing efforts by parents, lawmakers, and regulators in 2023 to limit social media use among young people.

Big Tech companies “are up against some serious social and governmental change primarily aimed at the youngest in society,” Qustodio wrote in its report. “Despite this, social media apps remain just as appealing as they ever were among children and teens, maintaining similar popularity levels year over year, or in some cases, growing to new levels.”

YouTube remained the most popular video app among young users in the U.S. in terms of total users. Sixty percent of U.S. teenagers reported using YouTube, compared with 44% who said they use TikTok. However, the number of minutes spent on YouTube was notably lower than on the short-form-focused TikTok. Younger users spent an average of 84 minutes a day on YouTube in 2023, up from 77 minutes a day in 2022.

State and federal lawmakers increased pressure on companies such as Meta, TikTok, and YouTube in 2023 as parents began to report problems with teenage mental health. The CEOs of Meta, X, TikTok, Snap, and Discord are scheduled to appear before the Senate Judiciary Committee on Jan. 31 to address the impact that their apps have had on youth mental health.

Meta announced on Jan. 9 that it would place the accounts of users under 18 under its most restrictive content settings in order to protect teenage mental health, a change that appears to have been made preemptively in anticipation of the pressure the company faces from legislators and the courts.

Sens. Marsha Blackburn (R-TN) and Richard Blumenthal (D-CT) have introduced the Kids Online Safety Act, which advanced out of committee and awaits a floor vote. The bill would require social media platforms, such as Facebook or YouTube, to give minors options to disable “addictive” product features and to opt out of algorithmic recommendations in favor of chronological feeds. It would also mandate that the platforms enable the strongest privacy settings by default, prevent “harmful” content from being displayed, and undergo annual independent audits of risks to minors.

More than 40 states sued Meta in the U.S. District Court for the Northern District of California in October, alleging that the company hid the extent of the damage its apps had caused to teenagers through the promotion of addictive behavior and harmful content. New Mexico also filed a suit against Meta in December, accusing it of hosting a “marketplace of predators” and failing to do enough to crack down on the sale of child sexual abuse material.

Utah’s attorney general sued TikTok in October, alleging that the app was causing child addictions and targeting young users with its algorithms.


At least four states have attempted to restrict teenage access to social media by requiring platforms to verify a user’s age through copies of IDs or other means. The tech advocacy group NetChoice filed suits against age verification laws in California, Arkansas, Ohio, and Utah and succeeded in obtaining preliminary holds in all four states.

Other countries are also increasing pressure on Big Tech companies. France approved a law in June that will require platforms to verify users’ ages to determine whether they are 15 or younger. The United Kingdom also passed the Online Safety Bill, comprehensive legislation that will require websites to take additional steps against underage access to online pornography, anonymous trolling, scam ads, the sharing of harmful AI-generated images, and the spread of child sexual abuse material.
