
AI Voice Cloning Products Lack Safeguards

Consumer Reports tested voice cloning products from ElevenLabs, Lovo, PlayHT, Resemble AI, and Speechify. It found that most of these products lack meaningful safeguards to stop fraud and misuse.

These products let anyone, including scammers, create an artificial copy of a person's voice (your boss's, your CEO's, your child's) from a very small audio sample.

One example: a voicemail from your kid saying they are in trouble, need money, and asking you to send it to them at …

Or fake testimonials from celebrities endorsing products.

Or your boss instructing you to send sensitive information or wire money somewhere for a secret deal. All of it sounds extremely real.

If you are not training your employees REGULARLY on this attack vector, they will forget the warning you gave them a year or more ago and comply with the attacker's request. One company that we know of lost $25 million to this kind of scam.

Do you have $25 million to spare to gift to North Korea? Contact us for help.

Credit: Consumer Reports