Facebook has announced that it removed 1.5 million videos in the first 24 hours following the deadly attack at mosques in New Zealand, which left 50 people dead and about as many injured.
Of the 1.5 million videos, 1.2 million were blocked at upload. Facebook's Mia Garlick said the company is working to remove violating content using a combination of technology and people.
To remove content that violates its policies, the company has a dedicated team of human moderators as well as AI-enabled systems that identify and flag inappropriate content.
Following the attacks in New Zealand, Facebook, YouTube, and Reddit took measures to remove accounts sharing the violent footage of the attack, which the gunman had live-streamed.
In addition to removing the original graphic video circulating on its platform, Facebook is also removing edited versions of the video that do not show graphic content, in order to curb its spread.