Instagram Shows Sexually Explicit Content to Accounts Following Children: WSJ Report

Meta is facing allegations that Instagram’s Reels algorithm has been delivering overtly sexual content to accounts that follow only children, while also displaying ads for big brands alongside that content.

A new report from the Wall Street Journal tested Instagram’s algorithm by following only the accounts of “young gymnasts, cheerleaders, and other teen and preteen influencers” — accounts that involved children and contained no sexual content.

The experiment found that Instagram began recommending sexual content, including both provocative and adult videos, to the test accounts. It also found that the children’s accounts the test users followed were themselves followed by adult men.

The report further found that Instagram Reels displayed ads for big brands such as Pizza Hut, Disney, Walmart, Bumble, and Match Group alongside the sexually explicit content.

In response, dating companies Bumble and Match Group suspended their Instagram advertising, objecting to their ads being mixed with adult content.

Meta’s Samantha Stetson said the test results are “based on a manufactured experience that does not represent what billions of people around the world see.”

She added, “We don’t want this kind of content on our platforms and brands don’t want their ads to appear next to it. We continue to invest aggressively to stop it — and report every quarter on the prevalence of such content, which remains very low.”

She concluded, “Our systems are effective at reducing harmful content, and we’ve invested billions in safety, security and brand suitability solutions.”
