AI-generated voices in robocalls can deceive voters. The FCC just made them illegal

NEW YORK — The Federal Communications Commission on Thursday outlawed robocalls that contain voices generated by artificial intelligence, a decision that sends a clear message that exploiting the technology to scam people and mislead voters won’t be tolerated.

The unanimous ruling targets robocalls made with AI voice-cloning tools under the Telephone Consumer Protection Act, a 1991 law restricting junk calls that use artificial and prerecorded voice messages.

The announcement comes as New Hampshire authorities are advancing their investigation into AI-generated robocalls that mimicked President Joe Biden’s voice to discourage people from voting in the state’s first-in-the-nation primary last month.

Effective immediately, the regulation empowers the FCC to fine companies that use AI voices in their calls or block the service providers that carry them. It also opens the door for call recipients to file lawsuits and gives state attorneys general a new mechanism to crack down on violators, according to the FCC.

The agency’s chairwoman, Jessica Rosenworcel, said bad actors have been using AI-generated voices in robocalls to misinform voters, impersonate celebrities and extort family members.

“It seems like something from the far-off future, but this threat is already here,” Rosenworcel told The Associated Press on Wednesday as the commission was considering the regulations. “All of us could be on the receiving end of these faked calls, so that’s why we felt the time to act was now.”

Under the consumer protection law, telemarketers generally cannot use automated dialers or artificial or prerecorded voice messages to call cellphones, and they cannot make such calls to landlines without prior written consent from the call recipient.

The new ruling classifies AI-generated voices in robocalls as “artificial” and thus enforceable by the same standards, the FCC said.

Those who break the law can face steep fines, with a maximum of more than $23,000 per call, the FCC said. The agency has previously used the consumer law to clamp down on robocallers interfering in elections, including imposing a $5 million fine on two conservative hoaxers for falsely warning people in predominantly Black areas that voting by mail could heighten their risk of arrest, debt collection and forced vaccination.

The law also gives call recipients the right to take legal action and potentially recover up to $1,500 in damages for each unwanted call.

Josh Lawson, director of AI and democracy at the Aspen Institute, said even with the FCC’s ruling, voters should prepare themselves for personalized spam to target them by phone, text and social media.

“The true dark hats tend to disregard the stakes and they know what they’re doing is unlawful,” he said. “We have to understand that bad actors are going to continue to rattle the cages and push the limits.”
