Texas Attorney General Ken Paxton filed a lawsuit against Snapchat on Thursday, alleging the social media platform fails to adequately warn users about its addictive features and exposes minors to inappropriate content. The lawsuit, announced February 12, 2026, centers on claims that Snapchat intentionally designs its application to be habit-forming, without providing sufficient safeguards for young people or informing parents about potential risks.
According to the complaint, Snapchat’s design choices prioritize user engagement over safety, leading to excessive use and potential harm, particularly among adolescents. The lawsuit asserts that the platform does not provide adequate warnings about the addictive nature of its features. Paxton’s office alleges that Snapchat’s policies regarding child safety are misleading.
The legal action reflects growing concerns about the impact of social media on youth mental health and well-being. Snapchat allows users to send photos and videos, known as “snaps,” that disappear after being viewed. This ephemeral nature of content is a key feature of the platform, but it is also a point of contention for parents and safety advocates. A recent guide for parents highlights the difficulty of monitoring Snapchat due to the disappearing-message feature, noting that the platform does not offer parental controls that allow viewing of private chats.
Snapchat provides a reporting mechanism within the app, allowing users to flag abusive or illegal content by holding down on a snap or story and selecting “Report Snap.” Users can also report concerns through the Snapchat website. However, the lawsuit suggests these measures are insufficient to protect users from harmful content and addictive behaviors.
The Attorney General’s office has not specified the remedies sought in the lawsuit, but it is expected to include demands for changes to Snapchat’s design and policies, as well as increased transparency regarding its algorithms and data collection practices. As of Friday, February 13, 2026, Snapchat had not issued a public response to the lawsuit.
Separately, legislators are also considering action against the artificial intelligence platform Grok, due to its potential to generate sexually explicit images of minors and women, according to reports.