Meta has admitted to CNBC that Instagram is experiencing a bug that flooded some users’ Reels feeds with videos its algorithms would not normally recommend. “We are fixing a bug that caused some users to see content in their Instagram Reels feed that should not have been recommended,” the company said. “We apologize for the error.” Users took to social media to ask whether others had also been served Reels depicting violence and sexual content. One Reddit user said his Reels feed was filled with school shootings and murders.
Others said they were shown graphic videos, including stabbings, beheadings, castrations, nudity, pornography, and footage of sexual assault. Some said they continued to see such videos even with Instagram’s Sensitive Content Control feature enabled. Social media recommendation algorithms are designed to show users videos and other content similar to what they normally watch, read, like, or interact with. In this case, however, Instagram was surfacing graphic videos even to users who had never interacted with such material, and sometimes even after a user had explicitly clicked the “Not Interested” button on a violent or sexual video.
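To illustrate why this behavior is anomalous, here is a minimal, hypothetical Python sketch of how the signals described above typically combine in a content-based recommender. This is not Meta’s actual system; the names (`UserProfile`, `score`, the `"graphic"` tag) are invented for illustration. The point is that all three safeguards modeled here appear to have failed at once.

```python
from dataclasses import dataclass, field

@dataclass
class UserProfile:
    liked_tags: set[str] = field(default_factory=set)           # tags from videos the user engaged with
    not_interested_tags: set[str] = field(default_factory=set)  # tags the user explicitly rejected
    sensitive_content_control: bool = True                      # user opted out of graphic content

def score(video_tags: set[str], user: UserProfile) -> float:
    """Rank a candidate video for a user's feed (hypothetical sketch).

    Overlap with past engagement raises the score, an explicit
    "Not Interested" signal suppresses the video, and graphic content
    is filtered out entirely when the sensitive-content control is on.
    """
    if user.sensitive_content_control and "graphic" in video_tags:
        return float("-inf")                    # filtered: never recommended
    if video_tags & user.not_interested_tags:
        return float("-inf")                    # suppressed by explicit feedback
    return len(video_tags & user.liked_tags)    # similarity to past interactions

# Example: a user who only watches cooking Reels.
user = UserProfile(liked_tags={"cooking", "travel"})
print(score({"cooking", "recipe"}, user))    # positive score: similar content
print(score({"graphic", "violence"}, user))  # -inf: blocked by the control
```

In this sketch, any one of the three checks would keep graphic videos out of the example user’s feed. Users reporting such videos despite no prior engagement, explicit “Not Interested” clicks, and an enabled sensitive-content control suggests the failure occurred upstream of filters like these.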
A Meta spokesperson did not tell CNBC exactly what caused the error, but some of the videos users reported should never have appeared on Instagram under the company’s own policies. “To protect users… we remove most graphic content and add warning labels to other graphic content so people know it may be sensitive or disturbing before they click on it,” the company’s policy states. Meta’s rules also state that it removes “real photos and videos of nudity and sexual activity.”