Stop big tech from making users behave in ways they don't want to


TLDR

  • A 2025 LA jury found Meta and YouTube liable for addictive product design after a confidential 2019 Meta slide concluded “Teens can’t switch off from Instagram even if they want to.”

Key Takeaways

  • The March 25 LA jury verdict is the first major ruling holding platforms liable for engineering addictive products, not just hosting harmful content.
  • Meta’s internal 2019 deck explicitly acknowledged teen users could not disengage voluntarily, directly informing the liability finding.
  • Author Marie Potel-Saville identifies reward-system manipulation, not dark patterns broadly, as the legal and regulatory frontier.
  • The ruling’s scope remains unsettled; implications for platform design obligations and future litigation are still being worked out.

Hacker News Comment Review

  • Commenters split on a core definitional problem: addictive design and dark patterns are not the same thing, and conflating them weakens both regulatory and legal arguments.
  • The legislative drafting challenge is real: no one in the thread produced a clean legal test that separates a “feature users want” from a “mechanism engineered to override user intent” and cannot be trivially bypassed.
  • A secondary thread distinguishes addictive-by-choice platforms (Instagram, TikTok) from mandated-network platforms (Google Play, Microsoft 365), arguing the latter pose a harder and less-discussed coercion problem.

Notable Comments

  • @Animats: Separates addictive tech from mandated tech; flags Google Chrome’s push toward the mandated category as an underreported risk.
  • @cortesoft: Asks the drafting question directly: how do you write law that stops addictive loops without killing beneficial features or being bypassed?
