Bank scammers have begun using artificial intelligence (AI) to replicate people's voices and carry out increasingly sophisticated frauds. With just a few seconds of audio taken from videos or social media, criminals can clone a victim's voice and use it to deceive family or friends, asking them for money on the victim's behalf. This tactic has already affected hundreds of people, and experts warn that millions could be vulnerable to this type of scam.
According to a survey conducted by Starling Bank, a UK bank, more than a quarter of respondents had been targeted by voice-cloning fraud attempts in recent months. Most alarmingly, 46% of people were unaware that these scams existed, and 8% said they would send money even if the call seemed suspicious. The ease with which scammers can obtain personal audio makes this technique especially dangerous.
To protect yourself, experts recommend agreeing on a "safe phrase" with loved ones, a kind of code that can be used to verify identity during calls. This is crucial, since cloned voices can sound extremely real, making them difficult to distinguish from an authentic call. It is also advised not to share this phrase by text message, as it could be intercepted.
The use of AI in fraud is on the rise, and its ability to replicate voices poses risks not only for bank scams but also for spreading misinformation or accessing personal accounts. Speech-synthesis technology, such as that developed by OpenAI, has proven so advanced that it is kept under strict control to prevent misuse.
Source: Segu-Info