An illustration of the logo of the U.S. social network Instagram on a tablet screen.
Kirill Kudryavtsev | AFP | Getty Images
Meta apologized on Thursday and said it had fixed an “error” after some Instagram users reported a flood of violent and graphic content recommended on their personal Reels pages.
“We have fixed an error that caused some users to see content in their Instagram Reels feed that should not have been recommended. We apologize for the mistake,” a Meta spokesperson said in a statement shared with CNBC.
The statement comes after numerous Instagram users took to various social media platforms to raise concerns about a recent influx of violent and not-safe-for-work content in their feeds.
Some users claimed they saw such content even with Instagram’s “Sensitive Content Control” set to its highest moderation level.
According to Meta’s policy, the company works to protect users from disturbing imagery and removes content that is especially violent or graphic.
Prohibited content includes videos depicting “dismemberment, visible innards or charred bodies,” as well as content containing “sadistic remarks towards imagery depicting the suffering of humans and animals.”
However, Meta says it allows some graphic content if it helps users condemn or raise awareness of important issues such as human rights abuses, armed conflict, and acts of terrorism. Such content may come with limitations, such as warning labels.
On Wednesday night in the U.S., CNBC was able to view several posts on its Instagram Reels feed that appeared to show dead bodies, graphic injuries, and violent assaults. The posts were labeled “Sensitive Content.”