
McAfee report finds cybercriminals can steal voices through AI


McAfee Corp has published a report on how artificial intelligence (AI) technology is fueling a rise in online voice scams. 

In its report “The Artificial Imposter”, McAfee conducted a thorough examination of the emergence of AI voice-imitation technology and its use by cybercriminals, revealing that 61% of Australians are not confident they can distinguish between a genuine voice and one generated by AI.

The company says this is a significant issue, as fraudsters are increasingly exploiting AI to mimic the voices of people known to their targets in order to defraud ordinary Australians.

The ACCC’s 2022 Targeting Scams Report found that scams in general are on the rise, with Australians losing at least $3.1 billion in 2022, an 80% increase on the total losses recorded in 2021.

Check out: Australians lost more than $3bn to scammers in 2022. Here are 5 emerging scams to look out for 

As artificial intelligence tools become more prevalent and widely used, it has become increasingly simple to manipulate many forms of media, including images, videos and, most concerningly, people’s voices.

According to McAfee’s investigation, cybercriminals are employing AI technology to replicate voices and send fraudulent voicemails or voice notes to people in a victim’s network, posing as someone in distress. 

“It’s clear that advanced artificial intelligence tools have already changed the game for cybercriminals,” said Tyler McGee, Head of APAC at McAfee.

“Now, with very little effort, they can clone a person’s voice and deceive a close contact into sending money.” 

Although Gen Z is often regarded as the most tech-savvy generation, McAfee’s study discovered that they are the least responsible and most vulnerable when it comes to voice cloning.  

62% of 18-29-year-olds acknowledged sharing their voice online at least once a week. Additionally, 66% of those aged 18-29 stated they are not worried about the increase in false information or disinformation on the internet. 

Despite being relatively cautious in their online behavior, Australians aged 50-65 remain a significant target for scammers. Nearly half of this age bracket are completely unaware of the existence of AI and voice-cloning scams, and the ACCC’s report indicated that this demographic sustained almost $100 million in losses from cyber scams.

Check out: Consumer watchdog sees sharp spike in Optus-related scams 

McAfee Labs’ security researchers investigated the accessibility, ease of use, and efficacy of AI voice-cloning tools.  

The team found more than a dozen voice-cloning tools freely available on the internet, many requiring only a basic level of experience and expertise to use. In one instance, three seconds of audio was enough to produce an 85% voice match.

By training the data models, McAfee researchers were able to achieve a 95% voice match based on just a small number of video files. 
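McAfee has not disclosed how these match percentages were scored, but a common way to quantify voice similarity is to compare speaker embeddings of two recordings. The following is a minimal sketch of that idea, assuming the open-source resemblyzer Python library and hypothetical file names; it illustrates the general technique, not McAfee’s methodology.

import numpy as np
from resemblyzer import VoiceEncoder, preprocess_wav

# Pretrained speaker-embedding model (illustrative; not McAfee's tooling).
encoder = VoiceEncoder()

# Hypothetical audio files: a genuine recording and an AI-generated clone.
real = preprocess_wav("real_voice_sample.wav")
clone = preprocess_wav("ai_cloned_sample.wav")

# Map each utterance to a fixed-length speaker-embedding vector.
real_embed = encoder.embed_utterance(real)
clone_embed = encoder.embed_utterance(clone)

# Cosine similarity between the embeddings, reported as a percentage.
similarity = np.dot(real_embed, clone_embed) / (
    np.linalg.norm(real_embed) * np.linalg.norm(clone_embed))
print(f"Voice match: {similarity * 100:.0f}%")

On a measure like this, a higher score means the clone’s embedding sits closer to the real speaker’s; it does not by itself prove a human listener would be fooled.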

The research team developed various AI-generated voice clones that were modeled after an Australian accent. Using accessible tools, they were able to improve the accent and produce a highly accurate and convincing result. 

The more realistic the voice clone is, the more likely a cybercriminal is to deceive someone into providing them with money. By exploiting the emotional weaknesses inherent in close relationships, these scams may net thousands of dollars in only a few hours. 

The research team concluded that artificial intelligence has fundamentally changed the game for cybercriminals: the barrier to entry has never been lower, making cybercrime simpler to perpetrate than ever before.

“It’s important to remain vigilant and to take proactive steps to keep you and your loved ones safe. Should you receive a call from your spouse or a family member in distress and asking for money, verify the caller – use a codeword, or ask a question only they would know,” McGee said. 

“Identity and privacy protection services will also help limit the digital footprint of personal information that a criminal can use to develop a compelling narrative when creating a voice clone.” 

Download McAfee’s latest report and learn how to stay protected against AI voice cloning scams.



Eliza is a content producer and editor at Public Spectrum. She is an experienced writer on topics related to the government and to the public, as well as stories that uplift and improve the community.
