The Federal Communications Commission is cracking down on robocalls that use voices generated by artificial intelligence, a move aimed at combating election fraud and phone scams that follows a controversial robocall using an AI clone of President Joe Biden in New Hampshire.
The FCC ruled unanimously on Thursday that using AI to generate voices in robocalls is illegal. Using AI to carry out a scam or fraud was already illegal.
These unsolicited calls are used “to threaten election integrity, harm public safety, and prey on the most vulnerable members of our society,” FCC Commissioner Geoffrey Starks said in a statement.
“We’re putting the fraudsters behind these robocalls on notice,” FCC Chairwoman Jessica Rosenworcel added.
The ruling came weeks after New Hampshire voters received calls from an unknown party playing a fake voice recording of Biden encouraging them not to vote for him in the primaries. The recordings were tracked to two companies in Texas, Attorney General John Formella said at a news conference announcing a criminal investigation on Tuesday, and a cease-and-desist letter was sent to them.
Americans are the target of more than 4 billion robocalls a month, according to an index by YouMail. The FCC works with 48 states to crack down on robocalls and bad-faith actors.
AI-created voice clones have already been used to trick people into paying excess fees and even ransoms for family members they believed had been abducted. One mother testified before Congress that scammers used an AI-generated clone of her daughter's voice to try to trick her into paying a $50,000 ransom.

