"While the DSA has created an obligation for platforms to identify and mitigate systemic risks in Europe, the first two years of risk assessments rely heavily on high-level company descriptions of policies, tools, and user controls. The assessments offer extremely limited insight into whether any of these interventions meaningfully reduce harm, particularly for minors. By contrast, US litigation is surfacing previously unreleased internal platform data, experiments, and deliberations that reveal how platforms internally measure risk and define acceptable trade-offs among risk, engagement, and revenue. But US litigation is largely reactive and limited to the facts of each specific case.
For example, internal company data released in US litigation shows that key safety mitigations – including screentime management tools, "take a break" reminders, and parental controls – suffer from extremely low adoption rates, often below 2% of minor users. Internal documents also suggest that the design of these features may undermine their effectiveness: TikTok leadership initially imposed "guardrail" metrics requiring that new screentime tools reduce usage by no more than 5%, while Meta's internal projections accurately predicted that 99% of teens would not use optional, opt-in "take a break" features.
The evidence emerging from DSA systemic risk assessments and US platform litigation underscores a central gap in current approaches to platform governance: risks are increasingly well described, but mitigations are rarely backed by rigorous, outcome-oriented data and evidence."
https://kgi.georgetown.edu/research-and-commentary/measuring-risk-what-eu-risk-assessments-and-us-litigation-reveal-about-meta-and-tiktok/
#SocialMedia #EU #USA #DSA #TikTok #Instagram #Algorithms #Meta #Facebook #PlatformGovernance #MentalHealth