Targeting the elderly: AI voice cloning scams are cause for alarm

A woman panicked when she heard her daughter crying on a phone call in which a man claimed he had kidnapped the girl and demanded a ransom. But the girl's voice was not real; it had been generated by artificial intelligence in an attempt to defraud the mother. It is a worrying development accompanying a technology that is booming at the moment.

Experts point out that the most significant danger of artificial intelligence is its ability to blur the line between fact and fiction, handing criminals effective and inexpensive tools. Recent phone scams using AI voice cloning tools readily available on the Internet have alarmed US authorities.

Jennifer DiStefano, a mother from Arizona, heard a voice on the phone saying, "Help me, Mom, please help me." She believed it was her fifteen-year-old daughter, who had gone out to practice skating. "The voice was identical to my daughter's, down to the way she cried," the mother told a local TV station last April. "I did not doubt for a moment that it might not be her."

The fraudster, who called the mother from an unknown number, asked for one million dollars in exchange for the girl’s release.
This incident, which came to a quick end once DiStefano was able to reach her daughter, is currently under investigation by the relevant authorities, and it has helped shed light on the potential for fraud arising from cybercriminals' use of artificial intelligence programs.

Disguised deepfakes

Wasim Khaled, CEO of Blackbird.AI, told Agence France-Presse that "AI voice cloning, which has become almost impossible to distinguish from a human voice, allows ill-intentioned people to obtain information and money from victims far more efficiently than they usually could."

Many free online applications can reproduce a person's actual voice using an artificial intelligence program fed a short recording of that voice, and a fraudster can obtain such recordings from content the victim has posted online.

"Artificial intelligence can clone a voice in English from a brief audio recording, which can then be used to leave messages and audio clips that appear to come from the victim," Khaled says. "The cloned voice can also be used as a modified voice during live calls, with fraudsters adopting various accents and imitating the way the victim speaks." He stresses that this technology "allows the creation of disguised deepfakes."

A survey of nearly 7,000 people in nine countries, including the United States, found that one in four respondents had been targeted by an AI voice fraud attempt or knew someone who had. And 70% of those surveyed said they were not sure they could tell a real voice from a cloned one, according to the survey, which was published last month.

US authorities recently warned of an increase in scams targeting the elderly. The Federal Trade Commission said in its warning: "You receive a call in which you hear your grandson's voice, terrified, saying he is in great trouble after a traffic accident and is being detained by the police, but that you can help him by sending him money."

Many elderly people commented under the commission's warning, saying that they had been deceived in a similar way.

Increased scams

One grandfather who was defrauded was so convinced by what he heard that he began gathering money and considered mortgaging his house before it became clear that the whole thing was a scam.

The ease with which audio can be cloned means that "every Internet user is at risk," Hany Farid, a professor at the University of California, Berkeley's School of Information, told AFP, noting that "these scams are on the rise."


The startup ElevenLabs had to admit that its AI voice cloning tool could be used for malicious purposes after several users posted a fake clip of actress Emma Watson reading excerpts from Adolf Hitler's "Mein Kampf."

"We are rapidly approaching the point where we can no longer trust content posted on the Internet, and we will have to rely on new technologies to make sure that the person we believe we are talking to on the phone is really the one we are communicating with," Farid says.
