Voice Hacking is on the Rise
Hacking is a moving target, and it keeps moving.
As banks consider using biometric authentication in place of passwords, hackers are thinking about that too.
Researchers at Black Hat demonstrated that they could synthesize your voice well enough to fool personal digital assistants.
Already there are products on the market from Adobe, Baidu, Lyrebird, Cereproc and others that can do voice spoofing to one degree or another.
Consider this: you have a voice-activated system that is trained to recognize your voice, or a person who knows you and would recognize your voice. But it is not you. It is a piece of software pretending to be you.
Over the next few years expect the price of this software to go down dramatically.
A hacker could, for example, embed your voice (or something that pretends to be your voice) in a video or audio clip and trick someone into playing it to compromise a system. That is just one possible scenario.
Think of this as a complement to the deep fake videos we are already seeing, where software puts the head of, say, a political candidate onto the body of a porn star. That is pretty easy today.
Deep fake audio is next.
So what should security professionals, developers, business executives and end users consider?
If something, like biometric authentication, seems too good (or too secure) to be true, it probably is.
Consider the risks.
Use it as only one part of the authentication process.
For high risk processes, use two or even three factors.
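To make the "one factor among several" point concrete, here is a minimal Python sketch. The function names, the 0.90 match threshold, and the one-time code are hypothetical illustrations, not any real biometric vendor's API; the point is simply that a high voice-match score on its own should never unlock a high-risk action.

```python
import hmac

# Assumed values for illustration; a real deployment would get the match
# score from its voice-biometric engine and the expected code from an
# out-of-band channel (authenticator app, SMS, hardware token).
VOICE_MATCH_THRESHOLD = 0.90  # hypothetical confidence cutoff

def voice_factor_ok(match_score: float) -> bool:
    """Treat the biometric result as one signal, not as proof of identity."""
    return match_score >= VOICE_MATCH_THRESHOLD

def otp_factor_ok(submitted_code: str, expected_code: str) -> bool:
    """Second factor: a one-time code, compared in constant time."""
    return hmac.compare_digest(submitted_code, expected_code)

def allow_high_risk_action(match_score: float,
                           submitted_code: str,
                           expected_code: str) -> bool:
    """High-risk actions require BOTH factors; a spoofed voice alone fails."""
    return voice_factor_ok(match_score) and otp_factor_ok(submitted_code, expected_code)

# A near-perfect voice match with the wrong one-time code is still rejected.
print(allow_high_risk_action(0.97, "123456", "654321"))  # False
print(allow_high_risk_action(0.97, "654321", "654321"))  # True
```

The design choice is the "and", not the "or": a convincing voice clone gets the attacker exactly nothing without the second factor.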
Sorry. When security meets convenience, convenience usually wins, and that usually means poor security.
Just sayin’!
Information for this post came from Entrepreneur.