Meta fixes error that flooded Instagram Reels with violent videos

Published by Global Banking & Finance Review

Posted on February 27, 2025

2 min read

Last updated: January 25, 2026


(Reuters) - Meta Platforms said on Thursday it had resolved an error that flooded the personal Reels feeds of Instagram users with violent and graphic videos worldwide.

It was not immediately clear how many people were affected by the glitch. Meta's comments followed a wave of complaints on social media about violent and "not safe for work" content in Reels feeds, despite some users having enabled the "sensitive content control" setting meant to filter such material.

"We have fixed an error that caused some users to see content in their Instagram Reels feed that should not have been recommended. We apologize for the mistake," a spokesperson for Meta said.

The company did not disclose the cause of the error.

Meta's moderation policies have come under scrutiny after it decided last month to scrap its U.S. fact-checking program on Facebook, Instagram and Threads, three of the world's biggest social media platforms with more than 3 billion users globally.

Violent and graphic videos are prohibited under Meta's policy and the company usually removes such content to protect users, barring exceptions given for videos that raise awareness on topics including human rights abuse and conflict.

The company has in recent years been leaning more on its automated moderation tools, a tactic that is expected to accelerate with the shift away from fact-checking in the United States.

Meta has faced criticism for failing to effectively balance content recommendations and user safety, as seen in incidents like the spread of violent content during the Myanmar genocide, Instagram promoting eating disorder content to teens and misinformation during the COVID-19 pandemic.

(Reporting by Surbhi Misra and Akash Sriram in Bengaluru; Editing by Saumyadeb Chakrabarty)

Key Takeaways

  • Meta fixed an error causing violent videos in Instagram Reels.
  • The error affected users globally despite content controls.
  • Meta's moderation policies are under scrutiny.
  • Automated moderation tools are increasingly used by Meta.
  • Meta recently ended its U.S. fact-checking program.

Frequently Asked Questions

What error did Meta resolve regarding Instagram Reels?
Meta resolved an error that caused users' Reels feeds to be flooded with violent and graphic videos that should not have been recommended.
How did users react to the content issue on Instagram?
Users expressed their concerns on social media, complaining about the presence of violent and "not safe for work" content in their Reels feeds.
What has been the criticism of Meta's content moderation?
Meta has faced criticism for failing to effectively balance content recommendations and user safety, particularly highlighted by past incidents like the spread of violent content during the Myanmar genocide.
What are Meta's policies regarding violent content?
Violent and graphic videos are prohibited under Meta's policy, and the company typically removes such content to protect users, except for videos that raise awareness on certain topics.
What changes has Meta made to its moderation approach?
In recent years, Meta has increasingly relied on automated moderation tools, a strategy expected to grow with the discontinuation of its U.S. fact-checking program.
