

California Attorney General Rob Bonta on Friday warned Californians against scams that use artificial intelligence or “deepfakes” to impersonate government officials, distressed family members or other trusted figures.
Emerging technologies such as AI and deepfake video or voice manipulation allow scammers to create sophisticated impersonations more easily, making their requests for money or personal information more convincing, according to the Attorney General’s Office.
“Scammers can use information available on the internet, including images and audio from social media, to convince people that the voice on the other end of the call is someone they can trust,” according to the AG’s announcement.
Criminals can duplicate an individual’s voice with AI technology trained on audio clips pulled from social media accounts, and they can lend the scam credibility by referencing personal information about the victim found on the internet, officials said.
“Scammers are often quite literally in our pockets, just a phone call, social media message, or text away,” Bonta said in a statement. “AI and other novel and evolving technologies can make scams harder to spot. Knowing what to look for is an important way to keep consumers safe against these tactics. I urge Californians to take practical steps to guard against being victimized by scammers, including talking to friends and family who may be unaware of these dangers.”
One recent scam targets parents by sending AI voice impersonations of their child begging for help, according to the AG’s Office.
“Recent reports have included parents receiving a phone call using the cloned voice of their child claiming to have been badly injured in a car accident or in need of money to pay bail,” officials said. “Grandparents are often the target of scams claiming that their grandchild is in trouble and in need of money. In 2023, the FBI received victim complaints regarding grandparent scams that resulted in nearly $1.9 million in losses.”
Scammers often target consumers’ mobile phones. In 2023, robocalls and robotexts resulted in more than $1.2 billion in reported losses nationwide, officials reported. Targeting smartphones also gives scammers access to potential victims’ email, social media and broader internet presence.
“These phone-based scams are designed to steal money, identities, or passwords, or urgently demand payment through cash or gift cards,” according to Bonta’s office. “Scams can result in significant financial losses, ruined credit scores, and impacted security clearance for service members and others.”
While younger adults reported losing money to fraud more often in 2023 than older adults, older adults who lose money tend to lose larger amounts, state officials said. In 2022, American adults 60 and over lost $1.6 billion to scams, the U.S. Federal Trade Commission reported.
Imposter scams were the most commonly reported fraud in 2023, according to the FTC.
“These imposter scams often involve a bad actor pretending to be a bank’s fraud department, the government, a well-known business, a technical support expert, or a distressed relative, such as a kidnapped child,” state officials said. “Other common phone-based scams include calls related to medical needs and prescriptions, debt reduction, utilities, bank fraud warnings, warranties, or IRS notices.”
AI-generated scams can also spread political misinformation. In January, New Hampshire residents received robocalls that allegedly used AI to impersonate the president and discourage voters from participating in the state’s primary, state officials pointed out.
Bonta’s office suggested practical measures Californians can take to protect themselves from phone-based scams.
Bonta noted a recent crackdown on robocalls, including AI-generated robocalls and robotexts.
He asked the FCC in January to address AI-generated robocalls, and the agency has since declared the voice-cloning technology commonly used in robocall scams illegal under the Telephone Consumer Protection Act. In February, Bonta and 51 other attorneys general sent a warning letter to Life Corp., which allegedly sent New Hampshire residents robocalls during the state’s primary election that used AI to impersonate President Joe Biden and discourage voters from participating in the primary.
In May 2023, Bonta and 48 other attorneys general sued Avid Telecom for allegedly making billions of unlawful robocalls, including scams involving the Social Security Administration, Medicare and employment, according to the Attorney General’s Office.
In January 2022, as part of a bipartisan multistate coalition, Bonta called on the FCC to halt “the flood of illegal foreign-based robocalls that ‘spoof’ U.S. phone numbers.” In August 2022, the attorney general announced the bipartisan nationwide Anti-Robocall Litigation Task Force “to investigate and take legal action against the telecommunications companies responsible for bringing a majority of foreign robocalls into the U.S.”
For more information and resources on phone-based scams, visit oag.ca.gov/consumers.
Updated June 3, 2024, 9:53 a.m.