https://www.linkedin.com/posts/marie-potel-saville_tiktok-knows-exactly-how-much-time-it-takes-share-7450423802775285760-y2Q3
#TikTok knows exactly how long it takes to get you addicted to its algorithm: 35 minutes.
According to internal documents revealed in a lawsuit, a user is likely to become addicted after 260 videos.
At 8 seconds per video, that's ~35 minutes.
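The arithmetic behind that figure is straightforward; a quick sanity check (using the 260-video and 8-second numbers cited in the post):

```python
# Sanity check of the "35 minutes to addiction" figure.
videos_to_addiction = 260  # threshold reported in the internal documents
seconds_per_video = 8      # average watch time assumed in the post

total_minutes = videos_to_addiction * seconds_per_video / 60
print(round(total_minutes, 1))  # 34.7, i.e. roughly 35 minutes
```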
We only know this because of a legal accident. In 2024, 14 US attorneys general sued TikTok for deliberately addicting teenagers.
In one of the lawsuits, the redactions were faulty and 30 pages of internal documents became public.
What they revealed is hard to read.
TikTok's own research found that “compulsive usage correlates with loss of analytical skills, memory formation, contextual thinking, conversational depth, empathy, and increased anxiety”.
They knew, they documented it, and they chose to keep building anyway.
Instead of trying to reduce screen time among teenagers, they built time-management tools to improve "public trust in the TikTok platform via media coverage."
The tobacco industry used the same playbook for 40 years: it also called it "problematic use" and tried to shift responsibility onto consumers.
But we now have the documents, and the courts are starting to use them.
TikTok is not an isolated case: Meta, YouTube, and others use the same logic. They all have the same kind of internal documents, and they make the same choice every day.
When an entire market is designed to exploit human cognitive weaknesses at scale, it is no longer a market in the economic sense (i.e., optimal allocation of resources and the greatest benefit for consumers); it is just a predatory system.
I now call it "predatory design".
Of course, regulation is necessary. But laws alone are not enough, and fines cannot undo a decade of engineered addiction.
The only answer to a systemic problem is a systemic solution: technology that puts human autonomy back at the center of digital design.
We need Human Safety Tech to protect all citizens, and especially the youngest, who are the most vulnerable.
That's what we're building at Fairpatterns, and we're only getting started!