The Standing Committee on Information Technology is considering a set of recommendations for “a stronger law” to deal with misinformation and with defamatory and obscene content on social media platforms, sources said. The measures could be presented in Parliament in the Winter Session.
On Monday, the standing committee questioned digital platforms about their measures to stop the spread of misinformation ahead of the upcoming Assembly and Lok Sabha elections. Representatives from YouTube, X (formerly known as Twitter), WhatsApp, Facebook, Instagram, and Snapchat, as well as Indian apps Koo, ShareChat, and Dailyhunt, were asked to provide written submissions within the next 10-15 days, according to a government source.
The standing committee had asked these platforms to send their representatives to Mumbai to discuss a list of points with officials of the Ministry of Electronics and Information Technology (MeitY) and the members of Parliament who are part of the committee. MeitY officials will direct the platforms on the exact details they need to submit before the committee.
“We have seen in the past that digital platforms are increasingly being used for sharing fake news, obscene content, and causing financial damage to other users. This has also led to widespread violence in recent days. Morphed content using artificial intelligence (AI) has further worsened this. Therefore, today the platforms were questioned on their obligation of content moderation and related regulations,” an official told Business Standard.
The Indian government is reportedly drafting the Digital India Bill (DIB), which could replace the country’s primary digital law, the Information Technology Act of 2000. The new legislation, according to an early draft, could grant the government increased powers to track, monitor, intercept, moderate, and take down online content. It could also allow the government to determine which intermediaries are exempt from liability for third-party digital communication or records.
This development follows several instances where the government warned social media platforms for failing to comply with India’s content moderation laws. The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, require intermediaries to remove unlawful information within 36 hours of receiving an order from a court or government agency.
Members of the Standing Committee have expressed dissatisfaction over inefficiencies and delays in addressing user grievances about content on social media platforms. They questioned whether inaction by digital platforms has made it harder for users to appeal against the decisions of grievance officers. The government-appointed grievance appellate committees (GACs) have received only 218 appeals in the seven months since their online portal was activated.
The 31-member committee is currently chaired by Shiv Sena MP Prataprao Jadhav. Representatives of Telegram and LinkedIn did not attend the meeting, said a person present there.
Government officials have repeatedly warned social media companies of criminal penalties for failing to proactively comply with orders. In 2021, Twitter was named in a first information report (FIR) filed by the Uttar Pradesh Police over a tweet containing misinformation.
Last week, MeitY sent notices to X, Telegram, and YouTube, asking them to proactively remove any child sexual abuse material (CSAM) shared via their platforms. The ministry also called for the implementation of proactive measures such as content moderation algorithms and reporting mechanisms to prevent future dissemination of CSAM.