Instagram's algorithm routinely serves up sexually charged videos featuring scantily clad sex content creators to teen users as young as 13, according to the alarming results of a seven-month analysis published Thursday.
The Wall Street Journal and a Northeastern University researcher probed the Mark Zuckerberg-led app's filters by creating accounts posing as fictional 13-year-olds and scrolling Instagram's Reels video feed, which reportedly began showcasing suggestive content almost immediately.
At first, the racy videos included women dancing suggestively or showcasing their breasts, the report said.
As the accounts watched those videos while skipping others, the content grew more graphic: videos featuring online sex workers who promised to send nude images to viewers appeared in less than 20 minutes, according to the Journal.
In one set of tests conducted in June, the Journal said Instagram began showing video after video about anal sex to a fictional 13-year-old who previously watched videos about women on the Reels feed.
In other cases, the recommendation algorithm served up videos of women caressing their bodies, mimicking sex acts or even flashing their genitalia to the camera, the Journal said.
The racy videos occasionally appeared alongside ads for major corporate brands, the report said.
Meta pushed back on the report's findings, with spokesman Andy Stone asserting it was an artificial experiment that "doesn't match the reality of how teens use Instagram."
"As part of our long-running work on youth issues, we established an effort to further reduce the volume of sensitive content teens might see on Instagram, and have meaningfully reduced these numbers in the past few months," Stone added.
The Post has reached out for comment.
The analysis was conducted by the Journal over a seven-month period stretching from January through June, according to the report. Northeastern University computer science professor Laura Edelson also replicated the test results.
The test accounts did not follow any other accounts or like any posts. To test how quickly Instagram ramped up illicit recommendations, the testers scrolled through Reels, watching the sexually charged videos while skipping others.
The Journal said it conducted similar tests on Snapchat and TikTok, neither of which recommended sexually graphic content to the test accounts under similar conditions.
Meanwhile, current and former Meta employees told the outlet that internal tests had uncovered problems with inappropriate content being served to underage users as far back as 2021.
In one internal report from 2022, Meta reportedly found that teen users viewed three times as many posts featuring nudity as adults.
Meta has repeatedly said that it is taking steps to ensure teen users have age-appropriate experiences on its apps.
The Journal uncovered the illicit content even after Meta rolled out stricter content controls in January meant to prevent teen users from being exposed to inappropriate content.
As part of those new restrictions, users under age 16 are supposed to be blocked from seeing sexually explicit content in their feeds.
The damning report marks yet another headache for Meta, which currently faces a sweeping federal lawsuit from dozens of states alleging that the company's apps have fueled a youth mental health crisis.
Meta is also being sued by the state of New Mexico in a separate suit alleging the company has failed to protect underage users from sexual predators active on its apps.
As The Post reported, a filing from that lawsuit revealed that executives from Walmart and Tinder parent Match Group confronted Meta after learning their ads were running next to content that sexualized underage users.
In January, Meta CEO Mark Zuckerberg issued a stunning apology to the families of victims of online child sex abuse during a high-profile hearing on Capitol Hill.
"No one should go through the things that your families have suffered," Zuckerberg said at the time. "And this is why we invest so much and we are going to continue doing industry-wide efforts to make sure no one has to go through the things your families have had to suffer."