Press "Enter" to skip to content

Fake audio, like email phishing, is a ‘threat waiting to happen’


SAN FRANCISCO — When your boss calls and tells you to wire $100,000 to a supplier, be on your toes. It could be a fake call. 

As if “phishing” emails weren’t enough, “deepfake” audio is now on the rise: cloned voices that sound nearly perfect and are easy for hackers to create.

“It’s on the rise, and something to watch out for,” says Vijay Balasubramaniyan, the CEO of Pindrop, a company that offers biometric authentication for enterprise. 

Balasubramaniyan demonstrated at a security conference how easy it is to take audio from the internet and use machine learning to stitch recorded phrases into sentences the speaker probably never said.


“All you need is five minutes of audio, and you can create fake audio,” said Balasubramaniyan. 

For instance, he pulled up a database of voices, typed “This morning American forces gave North Korea the bloody nose they deserve,” and linked the phrase to President Donald Trump’s name in the list. A few seconds later, he clicked play, and the result sounded eerily real.
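The kind of demo Balasubramaniyan described can be approximated with publicly available text-to-speech tools. The sketch below is an illustration only, using the open-source Coqui TTS library’s XTTS voice-cloning model; this is an assumption, not the tool Pindrop used, and the reference clip and output file names are hypothetical.

```python
# Minimal sketch of zero-shot voice cloning with the open-source Coqui TTS
# library (an assumption; the article does not say what software was demoed).
from TTS.api import TTS

# Load a pretrained multilingual voice-cloning model (downloaded on first use).
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# "reference.wav" is a hypothetical clip of the target speaker -- a few
# minutes of clean audio, e.g. scraped from public speeches or interviews.
tts.tts_to_file(
    text="This morning American forces gave North Korea the bloody nose they deserve.",
    speaker_wav="reference.wav",
    language="en",
    file_path="cloned_phrase.wav",
)
```

The point of the demo is that the attacker never needs access to the speaker: a short sample of public audio is enough to generate arbitrary new sentences in that voice.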

He also played a fake clip of Facebook CEO Mark Zuckerberg supposedly responding to the social network’s $5 billion fine in 2019 for privacy violations: “The FTC thinks a $5-billion fine is going to stop us from violating people’s privacy? Suckers.”





