No CSAM content on platform, govt didn't share any evidence: YouTube

The company has submitted its formal response to the government on the issue

Ashutosh Mishra New Delhi
Last Updated: Oct 16 2023 | 6:34 PM IST
YouTube announced on Monday that it has not found any child sexual abuse material (CSAM) on its platform, despite conducting multiple thorough investigations. The statement comes after the Ministry of Electronics and Information Technology (MeitY) directed social media platforms, including YouTube, Telegram, and X (formerly Twitter), to proactively remove any CSAM content. MeitY warned that failure to comply could lead to the loss of 'safe harbour' immunity, although it did not specify why these three platforms were particularly targeted.

YouTube noted that regulators have not provided any evidence suggesting the presence of child abuse content on its platform. "Based on our investigations, we found no evidence of CSAM on YouTube. We are committed to preventing the spread of such content and will continue to invest heavily in the technologies and teams that detect and remove it," said a spokesperson for YouTube.

According to the company's internal data, YouTube removed over 94,000 channels and more than 2.5 million videos for violating its child safety policy in the second quarter of FY24. YouTube, which boasts around 467 million active users in India—its largest user base globally—has submitted its formal response to the government on this issue.

Earlier this month, Union Minister of State for Electronics and IT Rajeev Chandrasekhar stated, "We have issued notices to prominent platforms, urging them to clean their platforms of explicit content that makes children vulnerable to exploitation." The notices also recommended the implementation of proactive measures, such as content moderation algorithms and reporting mechanisms, to prevent future dissemination of CSAM.

In 2018, Google launched a content safety application programming interface (API) that uses artificial intelligence classifiers to help organisations identify and prioritise CSAM content for review. YouTube's child safety policy expressly prohibits sexually explicit content featuring minors and any content that sexually exploits them.

Topics: YouTube, video streaming, child sexual abuse, Twitter

First Published: Oct 16 2023 | 6:30 PM IST