No online targeting

New EU law should become the global benchmark

Business Standard Editorial Comment Mumbai
Last Updated: Jan 27, 2022 | 11:31 PM IST
The European Parliament has passed the Digital Services Act (DSA), a landmark piece of legislation that should become the global benchmark for the provision of big tech services while protecting the privacy of users. The EU already had the General Data Protection Regulation (GDPR), which offers granular privacy protection to anybody (not only EU citizens) whose data is stored within the EU. The DSA lays down the dos and don’ts on the removal of harmful and illegal content, the offering or facilitation of sales of illegal products or services, targeted advertising, and the way interfaces are designed. Large online platforms must conduct mandatory annual or semi-annual risk assessments covering the dissemination of illegal content, the malfunctioning of the given service, and any “actual and foreseeable negative effects on the protection of public health”. The latter clause is a response to the pandemic and the misinformation centred around it.

The use of certain data to target users for advertising has been prohibited. The GDPR classifies data about race, ethnicity, political opinions, religious beliefs, health, and sexual orientation as special categories. Under the DSA, an online platform may not use such data to target individuals with advertising. “Dark patterns” are also forbidden, and there are new requirements aimed at tackling malicious “deep fakes”. Dark patterns appear when websites and apps are designed to induce users to subscribe to services, click through to affiliate websites, or click on advertising. The Act prohibits platforms from using “the structure, function or manner of operation of their online interface, or any part thereof, to distort or impair recipients of services’ ability to make a free, autonomous and informed decision or choice”. It also cites practices that “exploit cognitive biases and prompt recipients of the service to purchase goods and services that they do not want, or to reveal personal information they would prefer not to disclose”.
 
Intermediaries must also refrain from designing websites in ways that give more visual prominence to any of the consent options when asking the recipient of the service for a decision. The use of targeting techniques that process, reveal, or infer the personal data of minors for displaying advertisements is also prohibited. Deep fakes are cleverly manipulated images or videos that use AI to splice in existing persons, objects, places, or events. These can be falsely presented as authentic or truthful. While this may be done for creative purposes or for satire, such content is often deployed for malicious reasons as well. The DSA demands that deep fakes be clearly labelled in a way that makes it obvious the content is inauthentic. Online platforms must provide recipients with meaningful information about how their data will be monetised, to enable informed consent to the processing of personal data for advertising. Moreover, platforms would be prohibited from disabling users’ access to functionalities if they refuse to consent to the processing of their personal data.

Read together, the GDPR and the DSA give internet users in the EU the most comprehensive protections that exist. While platforms would be able to generate revenues from advertising and subscriptions, they would only be able to do so with the informed consent of adult users. Contrast this with the abysmal lack of privacy protection in India, and it is evident that our legislators have not translated the fundamental right to privacy from a theoretical construct into something that works in practice.