FTC Takes Steps to Protect Consumers from AI Deepfakes

The Federal Trade Commission last week announced that it is seeking public comment on a rule that would prohibit the impersonation of individuals, while also announcing a final rule prohibiting scammers from impersonating businesses or government agencies. The rulemakings are intended to combat the use of tools such as artificial intelligence to create deepfakes and impersonate celebrities and others. The FTC noted that consumers lost more than $10 billion to fraud in 2023, the first time fraud losses have surpassed that benchmark.

The supplemental rule prohibiting the impersonation of individuals grew out of comments filed on the proposed rule banning the impersonation of businesses and government agencies, the FTC announced. Scammers are using technology to defraud consumers by impersonating the voices and personalities of people they know — such as family members and friends.

The FTC is also considering holding liable the companies or platforms whose tools allow for the creation of impersonated images, voices, or videos if they know, or have reason to know, that a tool is being used to harm consumers.

Under the new rule prohibiting the impersonation of businesses and government agencies, the FTC will be able to seek monetary relief in federal court from scammers that:

  • Use government seals or business logos when communicating with consumers by mail or online.
  • Spoof government and business emails and web addresses, including spoofing “.gov” email addresses or using lookalike email addresses or websites that rely on misspellings of a company’s name.
  • Falsely imply government or business affiliation by using terms that are known to be affiliated with a government agency or business (e.g., stating “I’m calling from the Clerk’s Office” to falsely imply affiliation with a court of law).

The rule will go into effect 30 days after it is published in the Federal Register.
