Business Standard

Company agrees to pay $1 mn fine for making calls mimicking Biden to voters

The call featured a voice similar to Biden's falsely suggesting that voting in the state's presidential primary would preclude voters from casting ballots in the November general election



AP Meredith


A company that sent deceptive calls to New Hampshire voters using artificial intelligence to mimic President Joe Biden's voice agreed Wednesday to pay a $1 million fine, federal regulators said.

Lingo Telecom, the voice service provider that transmitted the robocalls, agreed to the settlement to resolve enforcement action taken by the Federal Communications Commission, which had initially sought a $2 million fine.

The case is seen by many as an unsettling early example of how AI might be used to influence groups of voters and democracy as a whole.

Meanwhile, Steve Kramer, the political consultant who orchestrated the calls, still faces a proposed $6 million FCC fine as well as state criminal charges.


The phone messages were sent to thousands of New Hampshire voters on January 21. They featured a voice similar to Biden's falsely suggesting that voting in the state's presidential primary would preclude voters from casting ballots in the November general election.

Kramer, who paid a magician and self-described digital nomad to create the recording, told The Associated Press earlier this year that he wasn't trying to influence the outcome of the primary, but rather wanted to highlight the potential dangers of AI and spur lawmakers into action.

If found guilty, Kramer could face a prison sentence of up to seven years on a charge of voter suppression and a sentence of up to one year on a charge of impersonating a candidate.

The FCC said that as well as agreeing to the civil fine, Lingo Telecom had agreed to strict caller ID authentication rules and requirements and to more thoroughly verify the accuracy of the information provided by its customers and upstream providers.

"Every one of us deserves to know that the voice on the line is exactly who they claim to be," FCC chairperson Jessica Rosenworcel said in a statement. "If AI is being used, that should be made clear to any consumer, citizen, and voter who encounters it. The FCC will act when trust in our communications networks is on the line."

Lingo Telecom did not immediately respond to a request for comment. The company had earlier said it strongly disagreed with the FCC's action, calling it an attempt to impose new rules retroactively.

Nonprofit consumer advocacy group Public Citizen commended the FCC on its action. Co-president Robert Weissman said Rosenworcel got it exactly right in saying consumers have a right to know when they are receiving authentic content and when they are receiving AI-generated deepfakes. Weissman said the case illustrates how such deepfakes pose "an existential threat to our democracy."

FCC Enforcement Bureau Chief Loyaan Egal said the combination of caller ID spoofing and generative AI voice-cloning technology posed a significant threat whether at the hands of domestic operatives seeking political advantage or sophisticated foreign adversaries conducting malign influence or election interference activities.




First Published: Aug 22 2024 | 8:26 AM IST
