AI robocall impersonator faces trial for fake Biden calls

The trial of a political consultant who used AI-generated Biden robocalls to manipulate voters highlights the growing intersection of artificial intelligence and electoral integrity. This landmark case both tests New Hampshire’s voter suppression laws and raises broader questions about AI regulation in politics, as states increasingly grapple with technology that can convincingly impersonate candidates and potentially interfere with democratic processes.

The big picture: Political consultant Steven Kramer faces 11 felony charges and 11 misdemeanors for sending AI-generated robocalls impersonating President Biden before the January 2024 New Hampshire primary.

  • The calls falsely told voters they should skip the primary and “save your vote for the November election,” potentially suppressing turnout in a critical early contest.
  • If convicted, Kramer could face decades in prison under New Hampshire laws prohibiting voter suppression and candidate impersonation.

The defendant’s argument: Kramer claims he orchestrated the calls as a “wake-up call” about AI dangers rather than to influence the election outcome.

  • He paid a New Orleans magician $150 to create the recording, which mimicked Biden’s voice and used his characteristic phrase “What a bunch of malarkey.”
  • “Maybe I’m a villain today, but I think in the end we get a better country and better democracy because of what I’ve done, deliberately,” Kramer told the AP in February.

Legal complexities: The trial involves unusual questions about whether New Hampshire’s primary was legitimate given Democratic National Committee actions.

  • Judge Elizabeth Leonard ruled that jurors may consider the DNC’s attempt to dislodge New Hampshire from its first-in-the-nation primary position as relevant to Kramer’s intent.
  • The court will instruct jurors that legally the state did hold its presidential primary on January 23, 2024, though they aren’t required to accept this conclusion.

Broader implications: The case represents one of the first major legal tests of AI deepfakes in political campaigns.

  • Half of U.S. states have enacted legislation regulating AI deepfakes in political campaigns, according to watchdog organization Public Citizen.
  • Meanwhile, House Republicans recently added a provision to a tax bill that would ban states from regulating artificial intelligence for a decade, though the measure faces significant hurdles in the Senate.

Financial penalties: Beyond criminal charges, Kramer faces significant financial consequences.

  • The Federal Communications Commission has already fined Kramer $6 million, though it’s unclear whether he has paid the penalty.
