Democratic operative admits to commissioning Biden AI robocall in New Hampshire

[Photo: Rep. Dean Phillips (D-Minn.) is interviewed in Nashua, N.H., in January. (Matt McClain/The Washington Post)]

A longtime Democratic consultant working for a rival candidate admitted that he commissioned the artificial intelligence-generated robocall of President Biden that was sent to New Hampshire voters in January and triggered a state criminal investigation.

Steve Kramer, who worked for the long-shot Democratic presidential candidate Dean Phillips, said in a phone interview with The Washington Post that he sent the AI-generated robocall, which told recipients not to vote, to "just under 5,000" people listed as "most likely Democrats" to vote in the New Hampshire primary. It marked one of the first major uses of AI to disrupt the 2024 presidential election cycle.

The Phillips campaign paid Kramer roughly $250,000 to get Phillips, a third-term congressman from Minnesota challenging Biden, on the ballot in New York and Pennsylvania, according to federal campaign filings. The Federal Communications Commission has issued him a subpoena for his involvement, Kramer said.

After the robocall, the Federal Communications Commission adopted a ruling clarifying that generating a voice with AI for robocalls is illegal. The agency swiftly issued a cease-and-desist letter to Kramer for "originating illegal spoofed robocalls using an AI-generated voice in New Hampshire" and issued a public notice to U.S.-based voice providers regarding blocking traffic related to the call.

“The agency is working diligently – including through all the tools available through its investigations – to ensure that harmful misuse of AI technologies do not compromise the integrity of our communications networks,” FCC spokesperson Will Wiquist said in a statement.

Kramer also shared details about how he created the robocall, confirming several details that had previously been the subject of speculation. He used software from the artificial intelligence voice-cloning company ElevenLabs to create a deepfake voice of Biden in less than 30 minutes.

The calls, he added, were delivered by Voice Broadcasting, an entity associated with Life Corp., the telecom company at the center of the criminal investigation that New Hampshire Attorney General John Formella opened in early February into the Biden AI robocall. Kramer said he created the robocall to raise awareness about the dangers AI poses in political campaigns.

“If anybody can do it, what’s a person with real money, or an entity with real money, going to do?” he said.

The Kramer incident highlights the ease and accessibility with which AI-generated content is making its way into the 2024 campaign cycle, allowing nearly anyone to use a wide array of tools to inject chaos and confusion into the voting process.

It also foreshadows a new challenge for state regulators, as increasingly advanced AI tools create new opportunities to interfere in elections across the world by creating fake audio recordings, photos and even videos of candidates, muddying the waters of reality.

The New Hampshire attorney general’s investigation into the robocall “remains active and ongoing,” said Michael Garrity, a spokesman for the office.

Phillips and his campaign have condemned the robocalls. Katie Dolan, a spokeswoman for the Phillips campaign, said Kramer's contract had ended before the campaign became aware of his involvement in the robocall.

“We are disgusted to learn that Mr. Kramer is behind this call, and we absolutely denounce his actions,” she said. Kramer’s involvement was first reported by NBC News.

The robocall using an AI-generated voice that sounded like Biden targeted thousands of New Hampshire voters the weekend before the New Hampshire Democratic presidential primary, telling them their vote would not make a difference, according to investigators.

The call, which began with a catchphrase of Biden’s, calling the election “a bunch of malarkey,” told voters: “It’s important that you save your vote for the November election.” The call appeared to come from the number of the former New Hampshire Democratic Party chair Kathy Sullivan, who was helping an effort to get voters to write in Biden’s name to show their support for the president, even though he wasn’t on the ballot. Sullivan and others reported the call to the state’s attorney general.

In early February, Formella announced a criminal investigation into the matter, and sent the telecom company, Life Corp., a cease-and-desist letter ordering it to immediately stop violating the state’s laws against voter suppression in elections.

A multistate task force also prepared for potential civil litigation against the company, and the FCC ordered Lingo Telecom to stop permitting illegal robocall traffic after an industry consortium found that the Texas-based company had carried the calls on its network.

“Don’t try it,” Formella said in the February news conference. “If you do, we will work together to investigate, we will work together with partners across the country to find you, and we will take any enforcement action available to us under the law. The consequences for your actions will be severe.”

The robocall incident is also one of several episodes that underscore the need for better policies within technology companies to ensure their AI services are not used to distort elections, AI experts said.

In late January, ChatGPT creator OpenAI banned a developer from using its tools after the developer built a bot mimicking Phillips. His campaign had supported the bot, but after The Post reported on it, OpenAI determined that it violated the company's rules against using its technology in political campaigns.

Paul Barrett, deputy director of the New York University Stern Center for Business and Human Rights, said in an email that it is apparent how powerful AI deepfakes can be in disrupting elections. “The new technology makes it far easier for nonexperts to generate highly persuasive content that is fraudulent and can potentially mislead people about when, how, or where to vote,” he said.

This is not the first time Kramer has used AI to spoof a politician's voice. Last year, he created an AI-generated robocall of Sen. Lindsey Graham (R-S.C.) asking nearly 300 "likely Republican" voters in South Carolina whom they would support if former president Donald Trump were not on the ballot.

Kramer, who said he plans to support Biden if he wins the Democratic nomination, said he hopes his actions have inspired regulators to take notice of AI’s potential impact on the election.

“It’s here now,” he said, referring to AI, “and I did something about it.”