A recently disclosed internal survey from social media company Meta reveals that a significant number of young adolescents using Instagram have encountered nude or sexual images they didn’t want to see. The findings, made public on February 23, 2026, as part of legal filings in a federal lawsuit, raise fresh concerns about the safety of social platforms for minors.

According to a 2021 self-reported survey, almost 19% of users aged 13 to 15 told Meta they had seen nudity or sexual images on Instagram that they did not choose to view. This information emerged during a deposition of Instagram head Adam Mosseri in March 2025, which was included in court records made public last week.

Meta says the figure came directly from what teens told the company, not from Meta reviewing content itself, and the company acknowledged that self-reported surveys can be “notoriously problematic.” Most of the explicit images reported by teens were shared through private messages, creating challenges for moderation because of user privacy protections.

In the same survey, about 8% of young teens said they had seen someone harming themselves or threatening self-harm on Instagram.

Meta’s Response and Safety Policies

Meta, which owns Instagram, has faced increasing legal and public scrutiny over how its platforms affect young people’s wellbeing. In late 2025, the company updated its safety policies to remove sexually explicit images and videos involving teens — including those created with artificial intelligence — with limited exceptions for educational or medical use.

Despite these policy changes, critics and safety experts say more needs to be done to protect minors from exposure to inappropriate content and harmful interactions online.

Broader Context

The release of these survey results comes amid a wave of lawsuits and regulatory actions targeting Meta and other big tech companies for alleged harms to children and teens. Plaintiffs in the U.S. argue that social media platforms contribute to mental health problems, addictive behavior, and unsafe online environments for youth.

Experts say this new data highlights ongoing challenges for social media companies in balancing user privacy with the need for effective content moderation — especially when it comes to protecting minors.
For teens and families, the report underscores the importance of digital literacy, open communication about online safety, and understanding how to report or block unwanted content.

Works Cited

Reuters. “Meta Survey Found 19% of Young Teens on Instagram Report Seeing Unwanted Nude Images.” Reuters, 23 Feb. 2026, www.reuters.com/legal/litigation/meta-survey-found-19-young-teen-instagram-users-saw-unwanted-nude-or-sexual-2026-02-23/.
