AI voice scams: What you need to know

The latest phone scams make it even harder to tell fact from fake.

Last updated: 26 August 2024

Fact-checked

Checked for accuracy by our qualified fact-checkers, verifiers and subject experts. Find out more about fact-checking at CHOICE.

Need to know

  • Criminals are using AI to impersonate the voices of people you may know to try to convince you to part with your money
  • Australians are being targeted by this new type of scam, which has already wreaked havoc overseas
  • AI is improving quickly, so paying close attention to the content of a call or voice message rather than the voice quality can help you identify this scam

They've caused chaos and distress overseas, but AI voice scams are now being deployed against Australians in attempts to steal our money and personal information.

Harnessing the latest technology to impersonate the voices of our loved ones or prominent people, these novel cons have made headlines in the US for getting around the defences of even tech-savvy scam-avoiders.

We're breaking down the types of AI voice scams you could encounter and how to spot and avoid them.

What are AI voice scams?

Many AI voice scams, which first emerged overseas last year, appear to be an evolution of the text message-based "Hi Mum" scam that gained notoriety in Australia in 2022.

This scam saw criminals contact victims by text, pretending to be a family member (often a child) in urgent need of money after losing their phone.

But the power of AI has now enabled scammers to take their impersonations to the next level, as Toby Murray, associate professor in the School of Computing and Information Systems at the University of Melbourne, explains.

"AI voice-cloning technology allows you to mimic someone's voice pretty closely and [it's] getting good enough now that the results are becoming almost indistinguishable," he says.


Scammers can use voice clones of a loved one in trouble to convince victims to hand over money.

"You can create a convincing, false recording of someone's voice that could then fool a family member into thinking that it's their loved one," Murray explains.

These recordings are then deployed against victims through phone calls or voice messages on social media apps.

But it's not just a family member or friend's voice you might hear on the other end of the line – investment scammers are also deploying this method, enlisting the voices of famous individuals to spruik get-rich-quick schemes and convince victims to hand over their savings.

Are AI voice scams happening in Australia?


The latest AI voice scams targeting Australians involve fake investment schemes.

Using prominent voices to promote fraudulent investment schemes is one of the first ways AI voice scams have been leveraged against Australians.

When Sydney-based advertising executive Dee Madigan received a message from a social media account pretending to be Queensland Premier Steven Miles, she knew it was a scam.

"Steven's a friend of mine, so I knew it wasn't him, [but] he had a fantastic idea about an investment," she recalls.

Wanting to test the would-be scammer, Madigan asked them to call her, thinking it would put an end to the interaction. What happened next surprised her.

"All of a sudden, my phone suddenly rang and on the other end was his voice," she recalls.

Over the course of a brief call and a follow-up audio message, the voice claimed to be too busy to talk, but promised to send her more information about the money-making opportunity.


"It was surprisingly good," Madigan says of the voice. "[It was] slightly robotic, but much better than I thought it would have been. It definitely did sound like Steven."

Miles' office didn't respond to a request for comment, but has previously confirmed the incident was a scam.

As for other cases, including instances where distressed family members were impersonated, AI-powered scams still appear to be in their infancy in Australia.

The ACCC's National Anti-Scam Centre says that since 2022 it has received fewer than five reports in which it suspects scammers used AI to clone a voice or video.


Australian banks are warning customers to look out for AI voice scams.

But experts and businesses say the threat is growing, with National Australia Bank (NAB) putting AI voice clones at the top of its list of scams likely to target Australians this year.

"We really should expect that, because this voice cloning technology is getting so good," agrees Murray.

"People have started to become much more aware of traditional [text message] scams and I think scammers know this, so we should expect … that scammers are going to adopt [AI voice] technology."

AI voice scams already causing devastation overseas

In recent cases in the US, people have paid hundreds of thousands of dollars in ransom to scammers running elaborate schemes in which an AI clone of a loved one's voice convinced them the person had been kidnapped.

America's consumer watchdog, the Federal Trade Commission, is warning people to beware of AI voice cloning, and earlier this year it ran a competition to find ways to protect consumers from these scams.

How AI voice scams work


The best AI tools only need seconds of your voice to create a clone.

A convincing voice clone can be made from mere seconds of original audio – something scammers can harvest from sources such as videos posted on social media.

Once they've made a clone, scammers will call targets with a pre-recorded message spoken in the cloned voice.

Dr Shaanan Cohney, a researcher at the University of Melbourne's Centre for AI and Digital Ethics, says the messages targeting family members closely follow the "Hi Mum" formula.

"A common one is someone you know is in urgent need and needs you to make a transfer of some funds to a particular location," he explains.

"There'll be an excuse provided for why the funds are needed and also for why no further voice communication can happen. The goal is to minimise the opportunity for the person to identify that something's wrong with the voice communication."

As with many modern cons, the perpetrators of AI voice scams will often direct targets to send money via a gift card, cryptocurrency or bank transfer.


There's been an explosion of easy-to-use cloning tools in recent years. Image: Google

New services powering scams

The growing prevalence of sophisticated AI tools has delivered criminals the means to copy voices quickly for relatively little cost.

"The technology has crossed a threshold where the improvements have accumulated to the point where [it's] now very usable with very little effort by ordinary people, rather than just by experts," explains Murray.

Cohney says that although sophisticated computer software programs are available, he suspects most groups running this scam are using easy-to-find, web-based options.

"[If you] type 'voice cloning' into Google, [you] get 'AI voice cloning', 'Clone your voice in minutes', 'Free AI voice cloning in 30 seconds'. There are hundreds of these services now."

Our AI voice clone experiment

Earlier this year, CHOICE signed up to one such service. For only $US1.10 ($1.66) a month, we were able to clone existing voices or create new ones from scratch.

Wanting to replicate the process a scammer might follow, we uploaded a few smartphone recordings of this author's voice, ensuring the original files were of a similar quality to social media videos.

Running these recordings through the tool several times, we were able to make our clone read a message similar to one a scammer might play to a victim's loved one.


We found the first versions of the voice we created would occasionally drop into an English accent. But after uploading further samples and changing the prompts to the cloning tool (instructing the engine on the gender, age and nationality of the original voice, for example) we were able to get a clone that sounded quite similar to the author's voice.
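
To give a sense of just how low the barrier has become, the sketch below shows roughly equivalent steps using the freely available, open-source Coqui TTS library (XTTS v2) – it is an illustration only, not the commercial web service we tested, and the file names and message are placeholders we've made up for this example.

```python
# A minimal voice-cloning sketch using the open-source Coqui TTS library.
# Illustrative only – not the service CHOICE tested. Install: pip install TTS
from TTS.api import TTS

# Load a pre-trained multilingual voice-cloning model (XTTS v2)
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# A short reference clip stands in for the smartphone recordings we
# uploaded in our experiment ("reference_clip.wav" is a placeholder)
tts.tts_to_file(
    text=(
        "Hi Mum, I've lost my phone and I'm calling from a friend's. "
        "Can you transfer me some money? I'll explain later."
    ),
    speaker_wav="reference_clip.wav",  # a few seconds of the target voice
    language="en",
    file_path="cloned_message.wav",    # the finished clone, ready to play
)
```

The particular library matters less than the point it makes: the entire pipeline – a few seconds of audio in, a scripted message out – now fits in about a dozen lines.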

If you listen to the sample of the voice below, you might notice subtle changes in accent in places, as well as a lack of the emotion you'd expect from someone in a stressful situation.

Press play to hear our AI voice clone

Can you tell if you're speaking to an AI clone?

AI experts say vocal cloning tools are now so good that a voice they create can be very difficult to tell apart from the original.

They instead recommend paying attention to what the caller is saying and listening for:

  • a sense of urgency
  • an unwillingness to explain things further
  • the absence of normal social cues 
  • missing signs of ordinary communication, such as a loved one not greeting you the way they normally would.

"Notice if things seem out of the ordinary for your communication with this particular person," Cohney advises. "If something appears out of context, then it's a wise move to inquire further."

If you're hearing the voice in a phone call, asking questions or trying to extend the conversation is an easy way to catch an AI scammer in the act.

Most current cloning tools are only effective at delivering short, pre-prepared messages, and can't engage in spontaneous conversation.

Finally, as with SMS-based "Hi Mum" scams, scammers will likely call from an unknown or private number. 

Treat unusual calls that sound like they're from a loved one but come from an unknown number with suspicion, and confirm what they're saying by contacting the person using details you already have.

Can you protect your voice from AI?


Videos of ourselves on social media can make our voices vulnerable to cloning.

Murray says the fact that many of us are sharing videos and audio of ourselves online means there isn't much we can do to stop our voices from being cloned.

"In the social media age, asking people not to post videos of themselves online is unreasonable," he says.

"Just as there is little you can do to prevent somebody creating a fake social media profile in your name, we are now entering an age in which our voices and faces can be impersonated by AI."

You can, however, adjust the privacy settings on your social media accounts so that videos and other sensitive posts can't be seen by people you're not connected with.

"Being vigilant is the best defence," concludes Murray, highlighting the previously mentioned methods for identifying an AI voice scam as the easiest way to stay safe.

If you're particularly worried about AI scammers impersonating loved ones, agree on a codeword or a question-and-answer exchange to use if one of you ever has to call for help.
