We have seen these before: corporate officials conned into sending $25 million to hackers using deepfakes, EU officials who thought they were having a conversation with the president of the African Union, and others.
This one is very close to home. IT COULD HAPPEN TO YOU!
Senate Foreign Relations Committee Chair Ben Cardin was the target of a SUCCESSFUL deepfaked conference call impersonating former Ukrainian Foreign Minister Dmytro Kuleba.
During the call (meaning the deepfake was NOT detected beforehand), the fake Kuleba asked questions the participants thought were very odd, attempting to gather intelligence on what the US might or might not support in the war.
The Senate’s security office sent an email to a select group, which, of course, was immediately leaked. While the email did not name Cardin, sources said it was him.
The voice was Kuleba’s, and it was likely generated with AI.
The memo said the fake Kuleba asked “politically charged questions in relation to the upcoming US election” and described the deepfake as technically sophisticated and believable.
The attack likely originated with Russia. While Russia successfully targeting US Senators might not be new, it is certainly concerning.
What is probably more concerning is someone targeting you in a voice or video call (voice calls are an order of magnitude easier to fake) in an effort to get you to do something, like send money to hackers. This recently cost one company $25 million when the scam was so believable that its accounting department fell for it. The money was never recovered.
Are you comfortable that your team would not fall for this sort of attack? Are you doing social engineering testing internally? If you think you might need help, please contact us. Credit: Punchbowl