To address the sharing of non-consensual intimate images, often referred to as "revenge porn", on its platform, Facebook on Friday announced new detection technology and an online support hub to help victims respond when this abuse occurs.
Called "Not Without My Consent," the resource hub will help victims find organisations and resources to support them, including steps they can take to remove the content from Facebook and prevent it from being shared further, Antigone Davis, Facebook's Global Head of Safety, said in a statement.
"We're also going to make it easier and more intuitive for victims to report when their intimate images were shared on Facebook," Davis said.
Facebook said it will also build a victim support toolkit to give people around the world access to locally and culturally relevant information and support.
"By using machine learning and artificial intelligence, we can now proactively detect near nude images or videos that are shared without permission on Facebook and Instagram," Davis said.
"This means we can find this content before anyone reports it, which is important for two reasons: often victims are afraid of retribution so they are reluctant to report the content themselves or are unaware the content has been shared," he added.
A specially trained member of Facebook's Community Operations team will review content flagged by the technology.
"If the image or video violates our Community Standards, we will remove it, and in most cases we will also disable an account for sharing intimate content without permission," Davis added.
--IANS