During a press conference, the Russian president, Vladimir Putin, was confronted by an AI-generated deepfake of himself, and for a moment he appeared lost for words. The doppelganger was actually a student from St. Petersburg Institute, who took the opportunity to ask Putin himself about his perspective on the dangers of AI.
Putin responded, “I see you may resemble me and speak with my voice. But I have thought about it and decided that only one person must be like me and speak with my voice, and that will be me.” He later added, “Whether we should fear artificial intelligence or not … it's impossible to prevent it. That means we should head and lead the process.” Putin, however, is not the only politician who has been deepfaked in the past couple of weeks.
A recent deepfake of Congresswoman Alexandria Ocasio-Cortez explaining ceasefires has sparked a conversation about the dangers of misinformation and the importance of deepfake detection. The video was so realistic that Reuters had to publish a fact-check article stating that it was digitally altered.
The deepfake has been shared widely on social media, with many users believing it to be real. This has triggered a wave of negative comments about Ocasio-Cortez, some of them offensive and threatening.
“Stupid is as stupid does! And this is who is in charge of the country. No wonder we are in such bad shape.” — Facebook comment (archive)
The alarming number of people falling for these deepfakes raises red flags, especially alongside a concerning evolution in scamming tactics: the new caller ID spoofing scam.
A new caller ID spoofing scam has emerged, enabling fraudsters not only to claim affiliation with a specific organization but also to manipulate their displayed phone number to match that organization's. The technique is not limited to organizational numbers; it extends to impersonating individual phone numbers as well.
Imagine receiving a call from a family member, complete with their genuine phone number and an eerily accurate replication of their voice. The caller urgently requests financial assistance for an emergency. In truth, it is not your family member on the line but a scammer using deepfake audio to heighten the authenticity of the deceit.
As deepfakes become more difficult to detect, we must be vigilant in our consumption of media. We should evaluate the information we see and hear, checking for inconsistencies and seeking out multiple sources.
As we face these challenges head-on, we appreciate your continued support. Thank you for being part of our journey toward a safer and more informed digital world!