
Safe harbour for social media companies likely to become narrower

Digital India Bill may propose making them liable for content from non-verified users

Sourabh Lele
Last Updated: Apr 16 2023 | 10:42 PM IST
As part of its work on the proposed Digital India Bill, the government is redrawing the contours of the ‘safe harbour’ provided to internet intermediaries including social media companies to increase their accountability for user-generated unlawful content, sources said.

Anonymity and non-traceability of users on online platforms will no longer exist once the Digital India Bill becomes law and replaces India’s primary digital law, the Information Technology Act of 2000, a senior official said. According to the preliminary discussions, the government may propose provisions to make social media platforms liable for content posted by non-verified user accounts.

“Most social media platforms already have user verification mechanisms. The responsibility for content generated by verified accounts may remain with the users,” the official said. With this, law enforcement agencies will be able to identify and trace the source of illegal activity more easily, he added.

The Ministry of Electronics and Information Technology (Meity) is currently examining similar laws in other countries that define the liabilities of online intermediaries for the content they host. Best practices on safe harbour provisions from other countries may be adapted where required.

“The principle of safe harbour will certainly change. Our objective is clear – one should be accountable for what one is doing. Online platforms have control over their functionalities and so they should take responsibility,” the official said.

Sources familiar with the matter said the content generated or influenced by online intermediaries will need to be marked separately. “There will not be any exemption for that content,” said one such person.

Section 79 of the IT Act says an intermediary shall not be liable for any third-party information, data, or communication link made available or hosted by him. This provides a ‘safe harbour’ or immunity for online platforms from legal action against them for illegal content shared on the platform.

The ministry last month held the first round of consultations with industry stakeholders, policy advocates and legal experts on the broad principles of the Digital India Act (DIA). “There is a greater diversity and complexity about the platforms that are increasingly on the internet and therefore, there is this legitimate question: Should there be a safe harbour at all?” Minister of State for Electronics and IT, Rajeev Chandrasekhar, had asked the stakeholders.

He added that in the current situation, anonymity combined with platforms “pretending to be dumb intermediaries” has led to a situation of crime, illegality, and user harm.

In several instances, the government has warned social media platforms that they could lose safe harbour protection for not complying with the laws of India. In 2021, Twitter was named a party to a First Information Report (FIR) filed by the Uttar Pradesh Police over a tweet containing misinformation.

The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 require intermediaries to remove unlawful information within 36 hours of receiving an order from a court or a government agency. However, government officials have repeatedly expressed dissatisfaction with the efforts of social media companies to proactively comply with such orders.

Governments across the world have, in recent times, strengthened intermediary liability laws. The recently passed Digital Services Act of the European Union seeks to articulate clear responsibilities for online platforms. It introduces new mechanisms allowing users to flag illegal content online, and requires platforms to cooperate with specialised ‘trusted flaggers’ to identify and remove illegal content.

In June 2020, France passed the Avia Bill, a law that requires social media intermediaries to remove ‘obviously’ illegal content within 24 hours, and content relating to terror and child abuse within an hour. Germany has the Network Enforcement Act, which mandates that platforms remove “manifestly illegal” content within 24 hours of being notified of such content. Where content is not manifestly illegal, social media providers must remove the post in question within seven days. Non-compliance can lead to significant fines.

In Australia, if internet intermediaries are aware that their service can be used for accessing or sharing abhorrent violent material, but do not remove such content “expeditiously,” or do not make the details of such content known to law enforcement agencies, they may face penalties of up to 10 per cent of their annual turnover for each offence.
Primary discussions on the Digital India Act indicate:

- Platforms to be held liable for content from anonymous accounts
- Policy to encourage voluntary verification of social media accounts
- Content generated by intermediaries to be marked separately
- The Act may adopt intermediary liability provisions from other countries
