A New Use for AI and You Won’t Like It
AI, especially generative AI from large language models, is THE hot item right now, but other forms of AI are also rapidly gaining popularity, specifically AI deepfakes of images and videos.
One concern on the minds of most politicians as we move into the 2024 election cycle is deepfakes in which politicians appear to do and say things that they neither did nor said.
Even though they are fake, such images and videos can have a significant impact on a candidate’s popularity.
These deepfakes could be the work of other candidates, or they could be the work of foreign adversaries. In either case, they often work, especially if they have an element of truth to them.
But there is another use of AI and it is very ugly.
Sextortion is the act of threatening victims with the public release of explicit images or videos that the malicious actor either stole or coerced the victim into sending.
But what if the malicious images or videos were neither stolen nor coerced?
What if the images and videos were produced using publicly available photos, say from social media, combined with AI? The fakes look very real and could easily be used to extort victims or ruin their reputation. Or they could be used simply to inflict pain.
All you have to do is Google the phrase "AI sextortion" and you will get an idea of the scope of the problem.
One woman who has been very public about being a victim of this type of attack is Twitch gamer QTCinderella. Another gamer got upset at her, used an online AI deepfake service to create deepfake porn videos using her face, and then published the videos. His purpose: revenge. He has admitted to doing it.
The FBI says that sextortionists are scraping publicly available images of their targets, feeding them into an online AI tool, and producing videos and images that look very real.
The FBI has posted an alert on its website warning people about the problem.
Since the law moves very slowly, these deepfakes may not themselves be illegal. Extortion is, of course, illegal, but if the fakes are published for some purpose other than extortion, they may not be.
Whatever their purpose, these deepfakes hurt victims and their families, and possibly do much more damage.
At least some victims will be inclined to try to pay off the extortionist, but that may or may not work.
The extortionists could publish one or two images, possibly only slightly embarrassing, with a threat to publish more explicit ones if the victim doesn't pay.
If you watch QTCinderella talk about her reaction and feelings when her gaming competitor published these videos of her, you get a sense of how deeply hurtful and embarrassing these attacks are. And in her case, it wasn’t even for money, but for revenge against a gaming competitor.
What is even worse is that it is almost impossible to stop this. It is not as if you can just tell people not to click on a link. Unless you want to live like a hermit, there is not much you can do, and even then, that might not be enough.
Credit: Bleeping Computer