The Asia Securities Industry & Financial Markets Association (Asifma) has raised objections to the Securities and Exchange Board of India’s (Sebi’s) proposed norms for regulating artificial intelligence (AI) and machine learning (ML) tools.
The regulator had floated a consultation paper in November, aiming to bring the usage of AI and ML tools under the regulatory ambit.
Further, it sought to define the responsibilities of registered entities (REs) for lapses such as data breaches, privacy violations, and misuse of investor data, as well as the action to be taken in case of violations.
In its submission to Sebi, the foreign portfolio investor (FPI) lobby argued that a one-size-fits-all approach could lead to over-regulation.
Instead, the industry body advocated a shared-responsibility framework, under which financial institutions remain accountable while third-party providers are responsible for specific parts of the AI value chain.
“An intermediary cannot be held responsible for any bad decisions by a client or external stakeholder based on the otherwise accurate and fair output of an AI tool,” Asifma said in its submission.
“While financial institutions will remain accountable, the responsibility of various controls in the generative AI lifecycle will depend on the deployment model. We suggest that responsibility (and liability) should lie with the party who has control over the specific element of the life cycle,” it added.
Asifma also pointed out the lack of clarity regarding reliance on AI tool outputs and recommended using the OECD definition of AI systems in the regulation.
“The way this requirement currently reads is as if when a client or stakeholder is using accurate and fair output from an AI tool, but then makes his/her own bad decision, this would be the responsibility of the RE. If that is indeed Sebi's intention, we are very concerned that this would be a significant overreach and be an outlier as compared to requirements and guidelines in other jurisdictions,” Asifma said.
The industry body cautioned that, if read this way, the proposed norms would amount to a significant overreach and would be inconsistent with requirements in other jurisdictions.
Sebi has already specified reporting requirements for AI and ML applications and tools used by stock brokers, exchanges and asset management companies, among others.
Case file
Sebi had proposed norms to bring accountability to the use of AI/ML tools
Measures aim to place responsibility on registered entities for any lapse or data breach
Asifma calls for use of the OECD's definition of AI in the regulation