Scammers steal $243,000 with deepfake audio mimicking CEO's voice
In March of this year, the director of a German energy company based in the United Kingdom received a surprising call. In an energetic, hurried tone, the company's CEO ordered an immediate transfer of $243,000 to an account in Hungary. The reason, he claimed, was a deadline for a last-minute payment.
Given the urgency in the caller's voice, the director made the transfer. Moments later, his phone rang again and the "CEO" asked him to send another payment. Now suspicious, he hung up and called the executive's personal number directly; the executive denied having made any such request. While they were still talking, the director received yet another call from the "fake boss", which confirmed the suspicion: the executive's voice had been digitally manipulated.
Increasing use
The case came to light in late August, disclosed by Euler Hermes, the French insurance company hired by the German firm (which was not identified) and responsible for covering the losses from the scam. It shows how advanced voice simulation technology has become. This, like video deepfakes (a technique that uses artificial intelligence to graft a given speech onto existing footage), worries companies because of the security implications.
And the German firm is not alone: researchers at the technology company Symantec told The Washington Post that they had found at least three other cases in which executives' voices were imitated to defraud companies, resulting in losses of millions of dollars.
Persuasion techniques
The simulation works like this: given a recording of a person's voice, the system "breaks" the sound into small pieces and syllables, which can then be reorganized to form new sentences with the same rhythm and tone of speech. Several companies currently operate in this market, notably Lyrebird, a startup focused on producing artificial voices.
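To make the idea concrete, here is a minimal, hypothetical Python sketch of that unit-concatenation approach: it chops a waveform into fixed-length "syllable" units and splices them back together in a new order with a short crossfade. The unit length, crossfade size, and the synthetic stand-in "recording" are all illustrative assumptions; real systems such as Lyrebird's are far more sophisticated, also modeling pitch, timing, and prosody rather than simply rearranging audio chunks.

```python
import numpy as np
from scipy.io import wavfile

# Illustrative sketch of unit concatenation, NOT any vendor's actual
# pipeline: segment a recorded waveform into short units, then splice
# the units in a new order to "speak" a sequence never recorded.

SAMPLE_RATE = 16_000          # 16 kHz mono, a common rate for speech
UNIT_MS = 120                 # assumed length of a syllable-sized unit
UNIT_LEN = SAMPLE_RATE * UNIT_MS // 1000

def split_into_units(waveform: np.ndarray) -> list[np.ndarray]:
    """Chop a waveform into fixed-length 'syllable' units."""
    n_units = len(waveform) // UNIT_LEN
    return [waveform[i * UNIT_LEN:(i + 1) * UNIT_LEN] for i in range(n_units)]

def crossfade_concat(units: list[np.ndarray], fade: int = 160) -> np.ndarray:
    """Join units with a short linear crossfade to soften the seams."""
    out = units[0].astype(np.float32)
    ramp = np.linspace(0.0, 1.0, fade, dtype=np.float32)
    for u in units[1:]:
        u = u.astype(np.float32)
        # Blend the tail of the output with the head of the next unit.
        out[-fade:] = out[-fade:] * (1.0 - ramp) + u[:fade] * ramp
        out = np.concatenate([out, u[fade:]])
    return out

if __name__ == "__main__":
    # Stand-in for a real recording: four tones acting as "syllables".
    t = np.linspace(0, UNIT_MS / 1000, UNIT_LEN, endpoint=False)
    recording = np.concatenate(
        [np.sin(2 * np.pi * f * t) for f in (220, 330, 440, 550)]
    )
    units = split_into_units(recording)
    # Reorder the units to form a "sentence" the speaker never said.
    fake = crossfade_concat([units[2], units[0], units[3], units[1]])
    wavfile.write("fake_voice.wav", SAMPLE_RATE, (fake * 32767).astype(np.int16))
```

Even this toy version hints at why the seams matter: abrupt joins between units produce exactly the kind of metallic artifacts mentioned below, which is why an attentive listener can often spot a synthetic voice.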
Originally, the goal of the technology was to humanize automated voice services and to create tools that improve the lives of people with speech impairments. But the development of this resource is now on the radar of security companies, which believe it should be more tightly restricted.
Even so, experts say the system is far from perfect and that "fake voices" would not fool a listener under normal conditions. What happens in these scams is that other intimidation techniques, such as time pressure or the weight of hierarchy, keep the victim from paying attention to the telltale signs: background noise, slow responses and metallic artifacts in the voice.
Reproduced from: https://computerworld.com.br/2019/09/06/bandidos-roubam-us-243-mil-com-audio-deepfake-imitando-voz-de-ceo/
Translated from: https://www.perallis.com/news/bandidos-roubam-us-243-mil-com-audio-deepfake-imitando-voz-de-ceo