How Deepfake Scams Use Familiar Faces to Defraud Victims
Cybersecurity-savvy internet users might feel secure knowing they'd never trust the word of a stranger online. But what about the word of someone you know and trust? Artificial intelligence-enabled deepfake video and audio scams are making fraud harder than ever to recognize.
Just ask Stacey Svegliato from Houston, who received a video call from someone who looked and sounded like a close friend.1 The next thing she knew, deepfake videos of Svegliato herself were sent out from her Facebook account, attempting to sell things to her friends and family — some of whom sent the scammer hundreds of dollars.
The scams are as varied as they are disturbing. CEOs of banks and financial institutions have been impersonated in deepfakes, as have leaders from other sectors. Stephen Henry of Toronto2 was tricked into sending a scammer $12,000 — all of his savings — based on a deepfake video of Canadian Prime Minister Justin Trudeau endorsing an investment opportunity. Scammers tricked a finance worker in Hong Kong into paying them $25 million, supposedly at the request of her firm's chief financial officer.3 It turned out that every one of her apparent colleagues on a multi-person video call was a deepfake. Even scientists are finding themselves the subjects of deepfake videos.4
With the rise of advanced AI, this emerging cybercrime blurs the line between reality and scam in a way many internet users aren't prepared for. One study tracking deepfake scams in the fintech sector found a 700% leap in incidents between 2022 and 2023.5 Deloitte estimates AI-enabled fraud caused $12.3 billion in losses in 2023 and projects that the annual total will reach $40 billion by 2027.6
Here's what you should know about deepfake scams today and how to protect yourself.
What Is a Deepfake Scam?
A deepfake scam happens when a fraudster uses AI-enabled technology to create a video or voice recording — pre-recorded or in real-time conversation — to trick a victim into sending them money or sensitive information. Deepfake technology is a method that cybercriminals can use to execute many types of scams. AI-enabled videos and voice fraud can be part of a romance scam, employment fraud,7 investment schemes, extortion, phishing and other types of financial fraud.
This can happen in various ways. A fake video or voice recording can portray a loved one or a famous personality. The deepfake may reach the victim via a direct phone or video call, or it could be posted to social media or YouTube. The scam may demand money urgently — as in a grandchild asking for money in an emergency — or could lure victims to a supposed investment opportunity or a fake product.
How Do Deepfake Scams Work?
Scammers create deepfake media using advanced AI that can replicate a voice or someone's entire video image. While the technology is complex, it can be quite easy for criminals to use depending on how advanced their particular tool is. Some AI tools are so smart that they can create a deepfake video from a still photo and a few seconds of audio.8 Other tools require several minutes of audio and hours of video.
Considering how many photos and videos people post publicly on social media, scammers can often get their hands on the source material they need. Once they have it, criminals can create deepfakes using AI technology, which they can buy on the dark web9 as a part of cybercrime kits.
Some technology allows scammers to make a recorded video or voice message. This type of media has more limited use because it can't interact with intended victims. But more advanced technology allows for real-time interaction between the victim and the deepfake video or audio. In the case of video, the scammer uses technology to map someone else's digital image onto their own face.10 Whatever the scammer says or does appears to be said or done by the person whose image was hijacked. This way, the scammer can engage the victim in conversation, deepening the hoax's effectiveness.
How To Protect Yourself From Deepfake Scams
The news for combating deepfake scams today is a bit grim. "We know we're not prepared as a society," one expert told the L.A. Times.8 While the technology to help detect deepfakes is developing, particularly for large organizations, individual victims are on their own for the time being.10 Here are ways you can protect yourself:
- Educate yourself and your loved ones. Knowing that deepfakes exist before one reaches you is the most powerful tool: you'll be primed to be skeptical when a friend, family member, or even celebrity suddenly tries to influence your behavior online.
- Be skeptical. Whether it's a celebrity endorsing a too-good-to-be-true investment deal or a family member asking for bail money, approach all digital media with a healthy dose of skepticism.11
- Pick a family code word. Set a code word that everyone in your family can remember, but no one else would guess. If someone reaches out by video or phone call claiming to be a family member in distress, ask for the code.8
- Look for AI glitches and limitations. AI isn't perfect. If a call seems suspect, look for oddities like hair or eyebrows that appear unnatural or light reflecting strangely off of glasses.8 Skin can look too smooth or too wrinkly. The edges of someone's face can look blurry or irregular.12 Ask the person to put their hand in front of their face or turn their head to the side. These movements can look unrealistic in a deepfake.
- Call to confirm an identity. Before sending money or sensitive information to anyone — whether it's a family member, friend, or anyone else you know — tell them you'll call them right back. Use the contact information you already have for that person to reach out to them and confirm the caller is who they say they are.13
- Limit what you share online. Protect your images and voice online. Remember that everything you post publicly, from websites to social media, can be fed to AI technology and turned into a deepfake.
- Use multi-factor authentication. For any accounts that use facial or voice recognition, use multi-factor authentication, so if scammers have stolen your image, they can't use it to access your accounts.11
If you believe you've been targeted by a deepfake, report the incident to the FBI's Internet Crime Complaint Center (IC3) and your local law enforcement.14 Also, follow the steps outlined in our guide, "What to Do if You Are a Victim of Fraud," to protect your credit and financial accounts.
Important disclosure information
This content is general in nature and does not constitute legal, tax, accounting, financial or investment advice. You are encouraged to consult with competent legal, tax, accounting, financial or investment professionals based on your specific circumstances. We do not make any warranties as to accuracy or completeness of this information, do not endorse any third-party companies, products, or services described here, and take no liability for your use of this information.
1. Heather Sullivan, "Houston women scammed by AI-generated deepfake videos," Fox 26 Houston, published September 23, 2024. Accessed January 14, 2025.
2. CTV News, "Ontario man loses $12K to deepfake scam that used video depicting PM Trudeau," YouTube, published March 27, 2024. Accessed January 14, 2025.
3. Heather Chen and Kathleen Magramo, "Finance worker pays out $25 million after video call with deepfake 'chief financial officer'," CNN, published February 4, 2024. Accessed January 14, 2025.
4. Linda Nordling, "Scientists are falling victim to deepfake AI video scams — here's how to fight back," Nature, published August 7, 2024. Accessed November 12, 2024.
5. Isabelle Bousquette, "Deepfakes Are Coming for the Financial Sector," The Wall Street Journal, published April 3, 2024. Accessed January 14, 2025.
6. Satish Lalchand, Val Srinivas, Brendan Maggiore, and Joshua Henderson, "Generative AI is expected to magnify the risk of deepfakes and other fraud in banking," Deloitte, published May 29, 2024. Accessed January 14, 2025.
7. Michael Kan, "FBI: Scammers are interviewing for remote jobs using deepfake tech," Mashable, published July 1, 2022. Accessed January 14, 2025.
8. Jon Healey, "Real-time deepfakes are a dangerous new threat. How to protect yourself," Los Angeles Times, published May 11, 2023. Accessed January 14, 2025.
9. Jeffrey Burt, "AI Now a Staple in Phishing Kits Sold to Hackers," MSSP Alert, published October 8, 2024. Accessed January 14, 2025.
10. Reece Rogers, "Real-Time Video Deepfake Scams Are Here. This Tool Attempts to Zap Them," Wired, published October 15, 2024. Accessed January 14, 2025.
11. Better Business Bureau, "BBB Tip: How to spot a deepfake and avoid scams," published June 14, 2022. Accessed January 14, 2025.
12. Vermont Secretary of State, "A.I. Deepfakes and Scams," accessed January 14, 2025.
13. National Council on Aging, "Understanding Deepfakes: What Older Adults Need to Know," published October 30, 2024. Accessed January 14, 2025.
14. Internet Crime Complaint Center, Complaint Form, FBI, accessed January 14, 2025.