Internet will never be the same again

Under the DSA, users will have to be informed if their content is taken down, along with a statement of reasons

Nikhil Pahwa
Last Updated : Apr 29 2022 | 2:19 AM IST
A website called Run your own social has a guide on setting up your own social network, for those who are “tired of Facebook or Twitter or wherever else and have thought that there’s got to be a better way”. While scarce on technical details, it suggests creating a code of conduct, reminding you: “You get to make the social rules and policies”. It reminds me of how the Internet used to be in the early 2000s, with multiple social networks and online forums to choose from. Each was its own benevolent dictatorship.

Large social networks today operate with similar ambiguity and arbitrariness. The winner-takes-all nature of the Internet has meant that in 2022, a few platforms like Facebook, Twitter, YouTube and TikTok dominate.

What allows all such spaces — big and small — to exist is the legal concept of safe harbour that governs online platforms. It allows platforms, as enablers and regulators of speech, to maintain a semblance of civility by enforcing their own codes of conduct, without being liable for the billions of messages and hours of video posted every day, which are next to impossible to police perfectly. They still need to take down illegal content when it is reported to them.

Additionally, the collection of vast amounts of data on users, along with behavioural monitoring and targeting, means that these platforms have immense power to control speech and affect the outcome of elections in democracies: They’ve become gatekeepers. Algorithms allow them to control what people see and what gets censored. A Facebook experiment in 2012 manipulated the mood of a set of users by altering the updates they saw. We’re thus dependent on the benevolence of these platforms to not harm us or our democracies.

Regulation is now trying to contain the power of these platforms, and add responsibilities to balance the freedom that safe harbour provides. While India enforced the IT Rules 2021 last year, the European Union’s proposed Digital Services Act (DSA) will be applicable from January 1, 2024. The DSA imposes new responsibilities on social networks and content-sharing platforms, among other online businesses in the EU. There are greater compliance burdens for large online platforms with 45 million users or more.

While the final draft is still not public, there’s an emphasis on content takedowns and transparency, and there are some similarities with regulations in India. EU member states can issue orders to platforms to act on illegal content. India has similar provisions — Section 69A of the IT Act allows the government to issue orders to block content, although, unlike what is proposed in the EU, these orders lack transparency. Like India, the EU has also failed to define clear norms and restrictions for takedown orders from governments.

Platforms have to establish a single point of contact in the EU, who will be held liable for non-compliance with obligations. Very large online platforms in the EU are also required to have a compliance officer. In India, platforms are required to hire a grievance officer, and the compliance officer here carries criminal liability.

Platforms in the EU have to enable users to notify them of illegal content, with priority given to “trusted flaggers” designated by EU member states for their expertise in identifying illegal content. Receipt of a complaint means a platform has “actual knowledge” of the illegal content, and it can thus be held liable if it does not act. In India, such a provision existed prior to 2015, but it was watered down by the Supreme Court judgment in the Shreya Singhal case. Given the rampant misuse of this provision to censor content, the court ruled that “actual knowledge” would have to mean a government or a court order. To address misuse, the DSA allows platforms to suspend the processing of complaints from those who repeatedly misuse the provision.

Platforms will need to inform law enforcement agencies when they become aware of a serious criminal offence involving a threat to life, or when one is likely to take place. Crisis protocols will eventually be drawn up for “extraordinary situations”, though it’s not clear what exactly these will entail.

Under the DSA, users will have to be informed if their content is taken down, along with a statement of reasons. Similar to India’s IT Rules, the DSA will allow users to file a grievance if their content gets disabled, or their account is suspended or terminated.

Platforms will have to publish detailed transparency reports at least once a year, including information on takedown orders received, action taken on their own initiative and action taken in response to user complaints. The DSA will mandate that very large online platforms explain to users how their content recommendation algorithms work, and give them the option of a recommendation system that is not based on profiling. Personalisation of advertising will have to be explained to the recipient of that advertisement. The EU will also draw up an online advertising code of conduct.

Very large online platforms will have to do extensive annual risk assessments covering the dissemination of illegal content and negative effects on free speech, privacy, non-discrimination and the rights of children, as well as provide vetted researchers with access to data to conduct research on systemic risks. They’ll have to implement measures to prevent intentional manipulation of their service that impacts the protection of public health, minors and, importantly, electoral processes.

It’s likely that some of these provisions will find their way into the impending amendment to India’s IT Act. While the amendment will most likely reduce the power of platforms, there is concern that it might end up empowering the government rather than users. One more thing is clear from these regulatory changes: Running a social network will become compliance-heavy and expensive. This will only entrench the existing large social networks that can afford to comply.
The writer is the founder of MediaNama

Disclaimer: These are personal views of the writer. They do not necessarily reflect the opinion of www.business-standard.com or the Business Standard newspaper
