Ahead of the New Hampshire presidential primary, many voters got a suspicious robocall that sounded like Joe Biden urging them not to vote. Perhaps unsurprisingly, it was an AI-generated clone of his voice, custom-made to confuse voters.
Now, the Federal Communications Commission wants to make AI-generated voice-cloning calls illegal under the Telephone Consumer Protection Act.
“AI-generated voice cloning and images are already sowing confusion by tricking consumers into thinking scams and frauds are legitimate,” FCC chairwoman Jessica Rosenworcel wrote in a statement. “We could all be a target of these faked calls,” she warned.
While AI-generated images and videos have appeared in political advertising this election cycle, deepfake voices, especially over the telephone, are arguably tougher to detect. Everyone sounds a little weird over the phone, right?
The FCC, aiming to act promptly amid the primaries and ahead of November's general election, is set to vote on the proposed rule change in the coming weeks.