Something Doesn’t Sound Right: Advisors Grapple With AI ‘Vishing’ Scams

Photo by Julian Hochgesang via Unsplash


Clearwater Capital Partners received a voicemail one weekend from what sounded like a client abroad, urgently requesting a fund transfer. Something felt off. The client had never asked for such a thing, and the caller identified himself by the client’s wife’s name. “It was the wife’s name with the husband’s voice,” said Jeff DeHaan, the firm’s managing partner.

Imitation, the so-called sincerest form of flattery, is a time-tested tactic for scams, and AI makes it simpler than ever. While artificial intelligence helps advisors reduce costs, speed production, and communicate with clients, it’s also being exploited by scammers to defraud wealth managers and their customers. One emerging threat is “vishing,” where fraudsters use AI to mimic voices to authorize fraudulent transfers.

“It’s not enough just to recognize the voice,” DeHaan told Advisor Upside, adding that advisors need strong verbal verification processes to authenticate a client’s identity. “The reality is you can’t use the last four digits of their Social Security number or their spouse’s birthday anymore. The things that we’ve always used to make sure a person is who they say they are don’t work.”

AI Got Your Tongue

AI voice cloning is surprisingly easy, said Freedom Dumlao, CTO at wealthtech firm Vestmark. “The barrier to entry has completely collapsed,” he told Advisor Upside. “It used to be that voice cloning required state-level resources; now anyone with a credit card can do it.” 

The scam goes both ways, Dumlao said. Clients should be prepared to encounter fishy phone calls supposedly from their advisors. Scammers only need a short sample from social media, YouTube, or a phone call. “If I want to capture their voice print and create a clone, all I have to do is call up an advisor and say, ‘Hey, I’m thinking about moving my account,’ and I can have them on the phone for 30 minutes,” Dumlao said.

AI-generated content has become quite convincing:

  • A 2025 study in Scientific Reports found participants mistook AI-generated voices for the real person 80% of the time and correctly identified AI voices only about 60% of the time. 
  • In a 2023 McAfee survey of people around the globe, 10% of respondents said they had received AI voice clone messages, and 77% of those lost money.

Hanging on the Telephone. Protecting clients from vishing doesn’t require high-tech solutions, however. If a supposed client asks for an urgent transfer, hang up and call the number on file. “That out-of-band verification is simple and goes through a channel you know is your client,” Dumlao said, adding that clients should be told advisors will never call urgently requesting money.

Firms should also implement unique authentication questions that only clients and advisors know. (We prefer the time-tested: What was your high school’s mascot?) DeHaan even recalled a pre-AI case in which fraudsters set up call forwarding on a client’s line. “The phone would ring once but then stop,” he said. “You can’t even trust the callback number.”
