
AI-led Voice Scams on the Increase

Scamsters appear to be using artificial intelligence to clone the voices of internet users and perpetrate cyber scams in India

At a time when generative AI is raising hackles among cybersecurity experts over its ability to take such crimes to the next level, a report says more than half of India’s internet users could fall victim to scams engineered by AI-led cloning of their voices. The development is likely to push calls for AI governance to new decibel levels. 

Earlier this month, McAfee said in a report that nearly half of India’s internet users were impacted, either directly or indirectly, by such voice cloning and the resultant crimes over the first quarter of 2023. The survey found that one in four adults had experienced such a scam, with 77% of those targeted losing money in the process. 

The McAfee report said that of the 77%, more than a third lost over $1,000, while 7% were duped out of between $5,000 and $15,000. It also noted that victims in the United States lost the most, with 11% of them having lost up to $15,000 through AI voice-cloning scams over the past few months. 

A majority of the victims said they shared their voices on social media platforms and other online spaces at least once a week. That is more than enough material for threat actors, who pick up these clips, clone them using artificial intelligence tools, and then deploy the results as deepfakes to pull off scams. 

What is voice cloning and how is it used for scams?

The voice cloning scam uses artificially generated audio files to dupe victims into thinking that their loved ones are in need of money and are seeking help. The scamster typically starts with a clip of the person talking, scraped from social media. This is run through an AI voice generator that uses machine learning to analyze the cadence and pitch of the original clip.  

The threat actors then use this to create original audio that mimics the subject’s voice near perfectly. The voice clip is sent to friends and relatives of the person via social channels, including WhatsApp, in the hope that they cannot distinguish between the real voice and an AI-generated imitation of it. 

It’s worth noting that voice cloning relies on an AI voice generator that turns text into speech. These generators use machine learning to teach themselves to speak in specific ways by analyzing data from audio files. Readers may have come across AI-generated voiceovers in office presentations or short video clips that mimic various accents. 
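To give a sense of how little effort this step now takes, the sketch below shows the basic reference-based text-to-speech flow described above. It assumes the open-source Coqui TTS library and its XTTS v2 model, which can condition generated speech on a short reference clip; the file names and text are placeholders, and a legitimate use such as a presentation voiceover looks exactly the same.

```python
# A minimal sketch of reference-based text-to-speech, assuming the open-source
# Coqui TTS library (pip install TTS) and its XTTS v2 model. File names and
# text are placeholders chosen for a benign voiceover use case.
from TTS.api import TTS

# Load a multilingual model that can condition output on a short reference clip.
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# Generate speech that imitates the cadence and pitch of the reference speaker.
tts.tts_to_file(
    text="This quarter's numbers are summarised on the next slide.",
    speaker_wav="reference_clip.wav",  # a few seconds of the target voice
    language="en",
    file_path="voiceover.wav",
)
```

The point is not the specific tool but that a few seconds of publicly shared audio is sufficient input for such systems, which is why experts urge caution about posting voice notes online.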

Can we differentiate between real and artificial? 

The survey found that close to three-quarters of respondents were unsure they could tell the difference between a human voice and its AI-led imitation. In fact, about two-thirds of all Indians said they were likely to respond to a voice message seeking financial help. This number dropped to 33% in Japan and France, 35% in Germany and 37% in Australia. 

What’s more, Indian users also topped the list of those regularly sharing their voices online, whether as short videos or voice notes in messaging groups. McAfee CTO Steve Grobman said it was the availability of and access to advanced AI tools that was changing the game for cybercriminals.  

“Instead of just making phone calls or sending emails or text messages, a cybercriminal can now impersonate someone using AI voice-cloning technology with very little effort. This plays on your emotional connection and a sense of urgency, to increase the likelihood of you falling for the scam,” Grobman said in a statement. 

So, what’s the best way to stay safe?

Cybersecurity experts suggest that urgent voice notes on messaging platforms should never be taken at face value. The best response is to ask the sender to call you back from the nearest phone; if their internet is working, there is every likelihood they can get their hands on a landline or a cellphone. 

And if one cannot reach the person who supposedly needs urgent money, the next best step is to seek the help of law enforcement. As for those who haven’t yet received any such calls, the best defense is to agree on code words with loved ones to verify each other’s identity. The technique is as old as James Bond movies and still works like a charm! 
