TikTok was aware that its design features are detrimental to its young users and that publicly touted tools aimed at limiting kids’ time on the site were largely ineffective, according to internal documents and communications exposed in a lawsuit filed by the state of Kentucky.
The details are among the redacted portions of Kentucky’s lawsuit, which contains internal communications and documents unearthed during a more than two-year investigation into the company by various states across the country.
Kentucky’s lawsuit was filed this week, alongside separate complaints brought by attorneys general in a dozen states as well as the District of Columbia. TikTok is also facing a lawsuit from the Department of Justice and is itself suing the Justice Department over a federal law that could ban it in the U.S. by mid-January.
The redacted information — which was inadvertently revealed by Kentucky’s attorney general’s office and first reported by Kentucky Public Radio — touches on a range of topics, most importantly the extent to which TikTok knew how much time young users were spending on the platform and how sincere it was when rolling out tools aimed at curbing excessive use.
Beyond TikTok use among minors, the complaint alleges the short-form video-sharing app has prioritized “beautiful people” on its platform and has noted internally that some of the content-moderation metrics it has publicized are “largely misleading.”
The unredacted complaint, which was seen by The Associated Press, was sealed by a Kentucky state judge on Wednesday after state officials filed an emergency motion to seal it.
When reached for comment, TikTok spokesperson Alex Haurek said: “It is highly irresponsible of the Associated Press to publish information that is under a court seal. Unfortunately, this complaint cherry-picks misleading quotes and takes outdated documents out of context to misrepresent our commitment to community safety.”
“We have robust safeguards, which include proactively removing suspected underage users, and we have voluntarily launched safety features such as default screentime limits, family pairing, and privacy by default for minors under 16,” Haurek said in a prepared statement. “We stand by these efforts.”
TikTok use among young users
The complaint alleges that TikTok has quantified how long it takes for young users to get hooked on the platform, and shared the findings internally in presentations aimed at increasing user-retention rates. The “habit moment,” as TikTok calls it, occurs when users have watched 260 videos or more during the first week of having a TikTok account. This can happen in under 35 minutes since some TikTok videos run as short as 8 seconds, the complaint says.
Kentucky’s lawsuit also cites a spring 2020 presentation from TikTok that concluded that the platform had already “hit a ceiling” among young users. At that point, the company’s estimates showed at least 95% of smartphone users under 17 used TikTok at least monthly, the complaint notes.
TikTok tracks metrics for young users, including how long they spend watching videos and how many of them use the platform every day. The company uses the information it gleans to feed its algorithm, which tailors content to people’s interests and drives user engagement, the complaint says.
TikTok conducts its own internal studies to find out how the platform affects users. The lawsuit cites one group within the company, called “TikTank,” which noted in an internal report that compulsive usage was “rampant” on the platform. It also quotes an unnamed executive who said kids watch TikTok because the algorithm is “really good.”
“But I think we need to be cognizant of what it might mean for other opportunities. And when I say other opportunities, I literally mean sleep, and eating, and moving around the room, and looking at somebody in the eyes,” the unnamed executive said, according to the complaint.
Time management tools
TikTok has a 60-minute daily screen time limit for minors, a feature it rolled out in March 2023 with the stated aim of helping teens manage their time on the platform. But Kentucky’s complaint argues that the time limit — which users can easily bypass or disable — was intended more as a public relations tool than anything else.
The lawsuit says TikTok measured the success of the time limit feature not by whether it reduced the time teens spent on the platform, but by three other metrics — the first of which was “improving public trust in the TikTok platform via media coverage.”
Reducing screen time among teens was not included as a success metric, the lawsuit says. In fact, it alleges the company planned to “revisit the design” of the feature if it caused teens to reduce their TikTok usage by more than 10%.
TikTok ran an experiment and found the time-limit prompts shaved just a minute and a half off the average time teens spent on the app — from 108.5 to 107 minutes per day, according to the complaint. But despite the lack of movement, TikTok did not try to make the feature more effective, Kentucky officials say. They allege the feature’s ineffectiveness was, in many ways, by design.
The complaint says a TikTok executive named Zhu Wenjia approved the feature only on the condition that its impact on TikTok’s “core metrics” would be minimal.
TikTok — including its CEO Shou Chew — has talked about the app’s various time management tools, including videos TikTok sends users to encourage them to get off the platform. But a TikTok executive said in an internal meeting that those videos are “useful” talking points but “not altogether effective.”
TikTok has ‘prioritized beautiful people’ on its platform
In a section that details the negative impacts TikTok’s facial filters can have on users, Kentucky alleges that TikTok’s algorithm has “prioritized beautiful people” despite knowing internally that content on the platform could “perpetuate a narrow beauty norm.”
The complaint alleges TikTok changed its algorithm after an internal report noted the app was showing a high “volume of ... not attractive subjects” in the app’s main “For You” feed.
“By changing the TikTok algorithm to show fewer ‘not attractive subjects’ in the For You feed, Defendants took active steps to promote a narrow beauty norm even though it could negatively impact their young users,” the complaint says.
TikTok’s ‘leakage’ rates
The lawsuit also takes aim at TikTok’s content-moderation practices.
It cites an internal communication in which the company notes its moderation metrics are “largely misleading” because “we are good at moderating the content we capture, but these metrics do not account for the content that we miss.”
The complaint notes that TikTok knows it has — but does not disclose — significant “leakage” rates: content that violates the site’s community guidelines but is not removed or moderated. Other social media companies face similar issues on their platforms.
For TikTok, the complaint says those “leakage” rates include roughly 36% of content that normalizes pedophilia and 50% of content that glorifies minor sexual assault.
The lawsuit also accuses the company of misleading the public about its moderation and allowing some popular creators who were deemed to be “high value” to post content that violates the site’s guidelines.