Scammers defraud people by digitally cloning the voices of their relatives


Scammers are using AI-powered voice-cloning software to defraud people by posing as someone close to them over the phone.
According to an article published Sunday in the Washington Post, a couple in their 70s in Canada received a call from a man whose voice sounded like their grandson's, telling them he was in jail and needed money for bail.

Without missing a beat, the grandparents withdrew 3,000 Canadian dollars from one bank and went to a second to withdraw more. There, the branch manager informed them that another customer had received a similar call, and that the voice on the other end of the line had turned out to be fake. The couple then realized they were about to be scammed.

The money vanished

Another couple was less fortunate. Two parents received a call from a lawyer telling them that their son had killed a U.S. diplomat in a car accident, was in jail and needed money for legal fees. The lawyer then put their son on the phone to say he needed 21,000 Canadian dollars.

Although the call was unusual, the parents, convinced that they had spoken with their son, complied and withdrew the money from several banks. They sent it to the lawyer through a bitcoin terminal. The couple was left bewildered when their real son later called to say that everything was fine. Despite alerting the police, the parents were unable to recover their money.

A 30-second sample is enough

According to the Washington Post, artificial intelligence makes it very easy for scammers to reproduce a person's voice from an audio sample of just a few sentences. A 30-second sample is enough to clone a voice. Scammers can then have it say whatever they type on a keyboard, most often to make people believe that a loved one is in distress.

Courts and law enforcement agencies remain largely helpless in the face of this type of scam. Most victims have few leads to identify the perpetrator, and it is difficult for police to trace the calls. As for the courts, there are too few legal precedents to hold the companies that develop these tools responsible for their misuse, the American newspaper explains.

Among these companies, ElevenLabs, a text-to-speech start-up founded in 2022, has seen its tools used to make celebrities appear to say anything, such as Emma Watson reciting passages from Adolf Hitler's "Mein Kampf". At the end of January, the company announced it would strengthen its safeguards against abuse, including launching a tool to detect AI-generated voices.
