A high-tech vishing attack utilising voice cloning has cost a UAE bank USD 35 million.
According to the court documents:
the Victim Company’s branch manager received a phone call that claimed to be from the company headquarters. The caller sounded like the Director of the company, so the branch manager believed the call was legitimate. The branch manager also received several emails that he believed were from the Director that were related to the phone call. The caller told the branch manager by phone and email that the Victim Company was about to acquire another company, and that a lawyer named Martin Zelner (Zelner) had been authorized to coordinate procedures for the acquisition. The branch manager then received several emails from Zelner regarding the acquisition, including a letter of authorization from the Director to Zelner. Because of these communications, when Zelner asked the branch manager to transfer USD 35 million to several accounts as part of the acquisition, the branch manager followed his instructions.
Voice Cloning Attack
The bank manager was duped by attackers who used "deep voice" AI technology – also known as voice cloning – to clone the director's voice. This was then used during the vishing call to impersonate the director and convince the bank manager that he was genuinely speaking to him. By mounting a multi-vector attack, with a convincing pretext and supporting emails to prepare for the call, the attackers ensured the bank manager was expecting the call and had no reason to doubt its authenticity when it occurred.
This case came to light through the public records of a request made by the UAE to the US government for help tracing some $416,000 of the stolen funds which ended up in US bank accounts. You can read the application here.
This is the second public case of a vishing attack utilising deep-fake voice cloning; a previous social engineering attack cost a UK energy firm around USD 240,000 in 2019.
How voice cloning can be used in a social engineering attack
Vishing attacks using voice cloning generally fall into two categories: a "conversation bank" of audio clips which can be played on demand by the attacker, and real-time audio conversion, where the attacker's voice is modified by an algorithm to sound like the person being impersonated.
The former approach is simpler, as the bank of audio clips can be composed of genuine recordings of the person being impersonated, likely taken from online sources such as webinars or YouTube videos. In addition, AI technology can generate additional words, phrases and sentences, which the attacker then has, literally at their fingertips, to play to the target as required during the vishing attack.
In this case, if the attackers had been able to listen in to VoIP calls and voicemail messages of the impersonated director, they would have been able to put together a suitable bank of clips mirroring his vocabulary and mannerisms, making the voice cloning attack highly believable.
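Conceptually, the clip-bank approach amounts to a lookup table of pre-generated audio keyed by phrase, played back on demand. A minimal sketch of the idea follows – the phrases, file paths and function are purely hypothetical illustrations, not tooling from this case:

```python
# Hypothetical sketch of a "clip bank": pre-generated audio keyed by phrase.
# Phrases and file paths are invented for illustration.
CLIP_BANK = {
    "yes": "clips/yes.wav",
    "approve the transfer": "clips/approve.wav",
    "i will email the authorization": "clips/email_auth.wav",
}

def respond(phrase):
    """Return the path of the clip to play for a given prompt, or None if
    the bank has no matching clip prepared."""
    return CLIP_BANK.get(phrase.lower().strip())
```

The `None` result for an unprepared phrase is exactly the gap that an off-script, open-ended question is designed to expose: the attacker has nothing ready to play.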
As tech companies improve the quality and speed of voice cloning deep-fake technology, it is likely to be used more often in multi-channel, high-value attacks that utilise vishing and other forms of social engineering.
What can you do to prevent a voice cloning vishing attack on your organisation?
If you suspect that the caller is using pre-recorded clips as responses to your dialogue, repeat a question and note whether the response sounds markedly different from the previous one. Even when we say "yes" twice in a row there will be subtle differences in the sound. Ask open-ended questions that are harder to answer from a script. The attacker's ideal position in such a call is to take an assertive, authoritative and controlling tone, so that they can direct the conversation and thus retain control over what they need to "play". Asking questions and going "off topic" is likely to undo this approach, so if in doubt ask some off-topic questions such as "What is the weather like where you are?" or "I visited that city a while ago – what's that famous museum called again?"
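The repeat-the-question test above can even be made quantitative. A minimal sketch, assuming the two responses are captured as equal-rate audio sample sequences (synthetic sine-wave data stands in for real recordings here): an exact replay of the same clip correlates almost perfectly with itself, while two genuinely separate utterances do not.

```python
import math

def clip_similarity(a, b):
    """Pearson correlation between two equal-rate audio sample sequences.
    A replayed pre-recorded clip scores ~1.0 against itself; two genuine
    human utterances of the same word score noticeably lower."""
    n = min(len(a), len(b))
    a, b = a[:n], b[:n]
    mean_a = sum(a) / n
    mean_b = sum(b) / n
    cov = sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b))
    sd_a = math.sqrt(sum((x - mean_a) ** 2 for x in a))
    sd_b = math.sqrt(sum((y - mean_b) ** 2 for y in b))
    return cov / (sd_a * sd_b)

# Synthetic stand-ins for two recorded responses (hypothetical data):
replayed = [math.sin(0.05 * t) for t in range(2000)]   # same clip twice
fresh = [math.sin(0.05 * t + 0.8) + 0.3 * math.sin(0.11 * t)
         for t in range(2000)]                         # a distinct utterance

replay_score = clip_similarity(replayed, replayed)  # ~1.0: likely a replay
fresh_score = clip_similarity(replayed, fresh)      # clearly below 1.0
```

In practice real recordings would need alignment and noise handling, but the principle is the same: a suspiciously identical second answer is a strong replay signal.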
The context of the call may feel odd or unexpected. The requests for information may feel overly persistent or even aggressive, and the tone of the call may simply leave you with a bad gut feeling. If anything leaves you feeling suspicious, or you are unsure that the caller is who they claim to be, do not give any further information to the caller. Give an excuse and end the call as soon as possible. Alert your security team and attempt to contact the real person through an alternative method or phone number.