AI Voice Scam Alert: Scammers Now Clone Your Loved Ones’ Voices to Steal Money and Info

A New Kind of Scam Is Here

In a world of fast-moving technology, scammers have found a new tool to trick people: your own voice. Imagine receiving a frantic call from your mom, your brother, or a close friend.

Their voice sounds real, perhaps even emotional or panicked. But here is the frightening part: it is not them at all. A scammer has used artificial intelligence to mimic their voice. This kind of fraud is becoming increasingly common, and everyone needs to know how it works.

How Do Scammers Clone Your Voice?

To produce a convincing fake of your voice, scammers need only a very short audio sample, often just a few seconds. That sample could come from a social media post or a voice note. AI software then studies your speech patterns and generates a counterfeit version of your voice.

Once scammers have your voice, they put it to work. They may call your family or friends, posing as you, and ask for money or personal details. Because the voice sounds so real, victims are often duped.

Real People Have Been Tricked

There have been real-life cases in which families fell victim. In these scams, the cloned voice often carries emotion, such as crying, urgency, or fear, to increase the pressure. Calls like these can feel completely real, which is precisely what makes them so dangerous.

What Makes This So Scary

Scam calls used to be easier to spot. If the voice sounded strange or the message was incoherent, that was an obvious red flag. Now, with AI able to speak in your loved ones’ voices, that line is disappearing. And given how much material we put online, such as video messages, voice notes, and podcasts, we rarely think twice about it. All of that gives scammers the raw material they need to clone our voices.

How to Protect Yourself

Here are some easy precautions you can adopt:

Be careful about what you release online. Avoid posting large amounts of audio of your voice or personal videos.

Establish a family password. Agree on a secret word or phrase within your close family or friend group, so that if someone calls in an emergency, using that word confirms their identity.

Verify before you act. If an emotional call sounds off, hang up and call the person back on the number they actually use.

Tell your family about it. That way, your relatives will not be fooled if such a scam reaches them.

Be Aware and Inform Others

AI can do great things, but it has also created real dangers. The best defense is to stay informed and to inform others. Share this warning with your family and friends, and be cautious with urgent or emotional voice calls.
