
Voice deepfakes are calling—here’s what they are and how to avoid being scammed



Credit: Pixabay/CC0 Public Domain

You’ve just come home from a long day at work and are about to sit down for dinner when suddenly your phone starts buzzing. On the other end is a loved one, perhaps a parent, a child or a childhood friend, begging you to send them money right away.

You ask them questions, trying to understand. There is something off about their answers, which are vague or out of character, and sometimes there is a peculiar delay, almost as if they were thinking a little too slowly. Yet you’re certain it’s definitely your loved one speaking: It’s their voice you hear, and caller ID shows their number. Rattled by their panicked urgency, you dutifully send money to the bank account they give you.

The next day, you call them back to make sure everything is OK. Your loved one has no idea what you’re talking about. That’s because they never called you—you’ve been tricked by technology: a voice deepfake. Thousands of people were scammed this way in 2022.

As computer security researchers, we’ve found that continuing advances in deep-learning algorithms, audio editing and engineering, and synthetic voice generation mean it is increasingly possible to convincingly simulate a person’s voice.

Worse still, chatbots like ChatGPT are starting to generate realistic scripts with adaptive real-time responses. Combining these technologies with voice generation turns a deepfake from a static recording into a live, lifelike avatar that can convincingly carry on a phone conversation.






The ability to copy one’s voice is increasingly within reach of anyone with a computer.

Voice cloning

Creating a compelling high-quality deepfake, whether video or audio, is not the easiest thing to do. It requires a wealth of artistic and technical skills, powerful hardware and a fairly hefty sample of the target voice.

There are, however, a growing number of services offering to produce moderate- to high-quality voice clones for a fee, and some voice deepfake tools need a sample of only a minute long, or even just a few seconds, to produce a voice clone that could be convincing enough to fool someone. Convincing a loved one, though, for example in an impersonation scam, would likely take a significantly larger sample.

Protection against scams and misinformation

With all that said, we in the DeFake Project of the Rochester Institute of Technology, the University of Mississippi and Michigan State University, along with other researchers, are working to make it possible to detect video and audio deepfakes and limit the harm they cause. There are also simple, everyday actions you can take to protect yourself.

For starters, voice phishing, or “vishing,” scams like the one described above are the most likely voice deepfake scams you might encounter in daily life, both at work and at home. In 2019, an energy firm was scammed out of US$243,000 when criminals simulated the voice of its parent company’s boss to order an employee to transfer funds to a supplier. In 2022, people were swindled out of an estimated $11 million by simulated voices, including those of close, personal connections.






Researchers have been able to clone a voice with as little as five seconds of recording.

What can you do?

Be mindful of unexpected calls, even from people you know well. This doesn’t mean you need to schedule every call, but it helps to at least email or text ahead of time. Also, don’t rely on caller ID, since that can be faked, too. For example, if you receive a call from someone claiming to represent your bank, hang up and call the bank directly to confirm the call’s legitimacy. Be sure to use the number you have written down, saved in your contacts list or that you find on Google.

Additionally, be careful with your personally identifiable information, like your Social Security number, home address, birth date, phone number, middle name and even the names of your children and pets. Scammers can use this information to impersonate you to banks, realtors and others, enriching themselves while bankrupting you or destroying your credit.

Here’s another piece of advice: know yourself. Specifically, know your intellectual and emotional biases and vulnerabilities. This is good life advice in general, but it is key to protecting yourself from being manipulated. Scammers typically seek to suss out and then prey on your financial anxieties, your political attachments or other inclinations, whatever those may be.

This alertness is also a decent defense against disinformation using voice deepfakes. Deepfakes can be used to take advantage of your confirmation bias, or what you are inclined to believe about someone.

If you hear an important person, whether from your community or the government, saying something that either seems very uncharacteristic for them or confirms your worst suspicions of them, you would be wise to be wary.

Provided by The Conversation

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Citation: Voice deepfakes are calling—here’s what they are and how to avoid being scammed (2023, March 20) retrieved 20 March 2023 from https://techxplore.com/news/2023-03-voice-deepfakes-callinghere-scammed.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.
