How AI-enabled deepfake voice clones are used in modern theft

Scam calls asking people to move their money have existed for ages. Usually, these calls come from strangers, so they are easy to spot. But if your CEO called you to finalize an agreement or sign off on a deal, would you do it? If so, you should know how AI-enabled deepfake voice clones are used in modern theft.

Deepfake – AI that manipulates audiovisual media

Artificial intelligence, which has revolutionized the world in so many ways, is nothing new to anyone. With almost every industry embracing AI, the field has advanced enormously. Among these advances, one has been challenging cybersecurity experts in particular: the deepfake, a relatively new application of AI used to manipulate audiovisual media.

Deepfakes make headlines frequently. Most of the time, the story is a manipulated video of a politician’s speech, a big-tech CEO’s statements, or a fake explicit video featuring a popular actor. This time, however, the cause is different and unusual. According to a recent report from NISOS, a security consulting firm, there has been an attempted fraud involving synthetic audio. Scammers have found a new way to conduct their activities: deepfake voice clones.

[Embedded audio: the synthetic voicemail clip NISOS shared with Motherboard]

NISOS shared this audio snippet with Motherboard. The quality isn’t very good, and anyone listening could be suspicious of its genuineness. The robotic voice might not convince many, but to a certain degree it sounds human. When asked whether the clip sounds more robotic or human, NISOS researcher Rob Volkert told Motherboard, “I would say more human.” He added, “But it doesn’t sound like the CEO enough.” The attack failed to convince the employee who received the message in his voicemail. Even so, it raises many questions about the kinds of attacks that might arise in the future.

Audio deepfake scams in the past

There have been reported attempts at similar scams in the past. The most famous, reported in September 2019, targeted a UK energy firm. Scammers tricked the firm’s chief executive into sending $240,000 to a Hungarian supplier, using a fake phone call supposedly from the CEO of the company’s parent firm in Germany. The executive sent the money after being told the transfer was urgent and had to be completed within an hour.

What to expect?

Like the broader field of AI, deepfakes are improving at an exponential rate. They could theoretically reach a point where a human can no longer tell a fake from the real thing. So far, the audio quality in these scams hasn’t been good enough. But with deepfakes, the more training data you have, the better the results, and gathering audio recordings of CEOs and other senior executives isn’t difficult. Interviews, meetings, and speeches can be a gold mine for scammers with access to deepfake software. Over time, the audio quality of these fakes will become more realistic, and the success rate of these scams is likely to increase. People who are easily manipulated will be the likeliest targets.
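
To illustrate how accessible this technology has become, below is a minimal voice-cloning sketch. It assumes the open-source Coqui TTS toolkit and its XTTS model; the NISOS report does not say what software the scammers used, and the file names here are placeholders.

    # Minimal voice-cloning sketch, assuming the open-source Coqui TTS
    # toolkit (pip install TTS) and a short reference recording of the
    # target speaker, e.g. pulled from a public interview or speech.
    from TTS.api import TTS

    # XTTS v2 clones a voice from a few seconds of reference audio;
    # no per-speaker training is required.
    tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

    tts.tts_to_file(
        text="I need you to finalize the transfer within the hour.",
        speaker_wav="ceo_interview_clip.wav",  # placeholder reference clip
        language="en",
        file_path="cloned_voicemail.wav",      # synthetic output
    )

The point is not this particular toolkit but how short the pipeline is: a few seconds of public audio and a dozen lines of code are enough to produce a clip of the kind NISOS described.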

What to do if you get these suspicious calls?

Getting a call from a CEO or other senior officer can be exciting, but not when it’s fake, and anyone could receive this type of call in the future. Most of these scams use burner VoIP accounts to reach their targets, so the best response is to hang up and call the person back on a number you know to be genuine. Likewise, if the request arrives by voicemail, confirm it through another channel before acting on it.
