
How To Protect Yourself from AI-Powered Phone Scams


Uzone.id – AI can be used for many things, both positive and negative. Of all its capabilities, the most frightening, in our view, is its ability to mimic real people’s voices.

It has never been easier to mimic a human voice: sample a recording, process it, and make it say anything a prompt specifies. You have probably already seen clips on social media of famous figures like Elon Musk saying things they would never actually say.


This voice-cloning technology is getting better and better at imitating other people’s voices. OpenAI, for instance, has showcased a project demonstrating how easy it is to clone a voice from just a 15-second recording.

The project, called Voice Engine, was first developed in late 2022.

“Voice Engine uses text input and a single 15-second audio sample to generate natural-sounding speech that is very similar to the original speaker. It is worth noting that a small model with a single 15-second sample can produce emotional and realistic voices,” OpenAI wrote on its official website.

OpenAI uses Voice Engine to power the preset voices available in its text-to-speech API, as well as ChatGPT Voice and Read Aloud. The feature is not yet available to general users, and OpenAI says it is preparing safeguards to prevent misuse.
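To make the distinction concrete: what ordinary developers can reach today is the text-to-speech API with its fixed preset voices, not Voice Engine’s cloning. A minimal sketch using OpenAI’s official Python SDK (model and voice names as published in OpenAI’s text-to-speech documentation; the output filename is our own choice) might look like this:

```python
from pathlib import Path
from openai import OpenAI

# The client reads the OPENAI_API_KEY environment variable.
client = OpenAI()

# Generate speech with one of the fixed preset voices ("alloy").
# There is no parameter here for supplying your own audio sample;
# Voice Engine's cloning is not exposed through this endpoint.
response = client.audio.speech.create(
    model="tts-1",
    voice="alloy",
    input="This is a preset text-to-speech voice, not a cloned one.",
)

# Save the generated audio as an MP3 file.
response.stream_to_file(Path("speech.mp3"))
```

The point is that the public endpoint only accepts a voice name from a fixed list; cloning from your own audio sample is what OpenAI is still holding back.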

Outside of OpenAI, however, ElevenLabs already offers a similar feature. For only around $6, you can clone a voice from a one-minute audio sample. Could this be abused? Absolutely.

Consider that with social media freely accessible to everyone, scammers can obtain a voice sample of almost anyone. They only need to pay a few dollars to get the clone they want, then use it to deceive their victims.


The most common cases are scams targeting grandparents. Scammers pretend to be their grandchildren and urgently ask for money for some supposedly important reason.

Grandparents, of course, don’t want anything bad to happen to their grandchildren. So when they get a call and the scammer, imitating their grandchild’s voice, spins a scenario such as a car accident or an arrest by the police, they are primed to believe it.

These scammers usually also tell the victim to keep quiet and not mention the call to anyone, especially the grandchild’s parents. Many people have fallen for it, because the AI-generated imitation sounds so ‘real’.

Even victims who sense that something is off can still be deceived if they are not careful. After all, what grandparent, merely on a suspicion, would accuse their own grandchild of being a scammer? And if challenged, the scammer will simply deny it.

That is the problem: many people are taken in by the sophistication of today’s AI when it is exploited by irresponsible people. So how do we protect ourselves from AI-based voice-cloning scams?

As PC World suggests, one way is to hold a family meeting and agree on a password to be used in emergencies. If you get a call from someone claiming to be a family member and asking for money, ask for that password.

If the caller can’t answer, it is very likely a scammer on the line. Sometimes, countering the misuse of sophisticated technology calls for ‘old-fashioned’ methods.
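The password check is, in effect, a shared-secret challenge. Purely as an illustrative sketch (a real family password lives in people’s heads, not in software; the secret and helper below are hypothetical), the logic looks like this:

```python
import hmac

# Hypothetical pre-agreed family password. A real one, of course,
# lives in people's heads, never in a file or an app.
FAMILY_SECRET = "blue-pelican-42"

def caller_is_verified(answer: str) -> bool:
    """Return True only if the caller produces the agreed password."""
    # Normalize the answer, then compare. hmac.compare_digest is the
    # idiomatic way to compare secrets in Python (constant-time).
    return hmac.compare_digest(answer.strip().lower(), FAMILY_SECRET)

# A guessing scammer fails; a real family member passes.
print(caller_is_verified("uh... grandma, please hurry"))  # False
print(caller_is_verified("Blue-Pelican-42"))              # True
```

The design point is the same as on the phone: the verifier asks for the secret and compares it, and never volunteers it to the caller.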

Another option is to install an anti-spam call application such as Truecaller. It can identify and block spam calls, helping protect users from fraud via phone calls.

