How Deepfake Scams Use Familiar Faces to Deceive Victims

Deloitte estimates that AI-enabled fraud caused $12.3 billion in losses in 2023 and projects that the annual total will reach $40 billion by 2027.6

Considering how many photos and videos people post publicly on social media, scammers can often get their hands on the source material they need. Once they have it, criminals can create deepfakes using AI technology, which they can buy on the dark web9 as part of cybercrime kits.

Some technology allows scammers to make a recorded video or voice message. This type of media has more limited use because it can't interact with intended victims. But more advanced technology allows for real-time interaction between the victim and the deepfake video or audio. In the case of video, the scammer uses technology to map someone else's digital image onto their own face.10 Whatever the scammer says or does, the person in the hijacked image appears to say or do it too. This way, the scammer can engage the victim in conversation, making the hoax even more convincing.


How To Protect Yourself From Deepfake Scams

The outlook for combating deepfake scams today is a bit grim. "We know we're not prepared as a society," one expert told the L.A. Times.8 While technology to help detect deepfakes is developing, particularly for large organizations, individual victims are on their own for the time being.10 Here are ways you can protect yourself:

  • Educate yourself and your loved ones. Knowing that deepfakes exist before one reaches you is your most powerful defense; that awareness primes you to be skeptical when a friend, family member, or even a celebrity suddenly tries to influence your behavior online.
  • Be skeptical. Whether it's a celebrity endorsing a too-good-to-be-true investment deal or a family member asking for bail money, approach all digital media with a healthy dose of skepticism.11
  • Pick a family code word. Set a code word that everyone in your family can remember, but no one else would guess. If someone reaches out by video or phone call claiming to be a family member in distress, ask for the code.8 
  • Look for AI glitches and limitations. AI isn't perfect. If a call seems suspect, look for oddities like unnatural-looking hair or eyebrows, or light reflecting strangely off glasses.8 Skin can look too smooth or too wrinkled. The edges of someone's face can look blurry or irregular.12 Ask the person to put their hand in front of their face or turn their head from side to side; these movements often look unrealistic in a deepfake.
  • Call to confirm an identity. Before sending money or sensitive information to anyone, whether it's a family member, friend, or anyone else you know, tell them you'll call them right back. Use the contact information you already have for that person to reach out and confirm the caller is who they say they are.13
  • Limit what you share online. Protect your images and voice online. Remember that everything you post publicly, from websites to social media, can be fed to AI technology and turned into a deepfake. 
  • Use multi-factor authentication. For any accounts that use facial or voice recognition, enable multi-factor authentication so that even if scammers have stolen your image or voice, they can't use it alone to access your accounts.11

If you believe you've been targeted by a deepfake, report the incident to the FBI's Internet Crime Complaint Center (IC3) and your local law enforcement.14 Also, follow the steps outlined in our guide, "What to Do if You Are a Victim of Fraud," to protect your credit and financial accounts.

Important disclosure information

This content is general in nature and does not constitute legal, tax, accounting, financial or investment advice. You are encouraged to consult with competent legal, tax, accounting, financial or investment professionals based on your specific circumstances. We do not make any warranties as to accuracy or completeness of this information, do not endorse any third-party companies, products, or services described here, and take no liability for your use of this information.

  1. Heather Sullivan, "Houston women scammed by AI-Generated deepfake videos," Fox 26 Houston, published September 23, 2024. Accessed January 14, 2025.
  2. CTV News, "Ontario man loses $12K to deepfake scam that used video depicting PM Trudeau," YouTube, published March 27, 2024. Accessed January 14, 2025.
  3. Heather Chen and Kathleen Magramo, "Finance worker pays out $25 million after video call with deepfake ‘chief financial officer’," CNN, published February 4, 2024. Accessed January 14, 2025.
  4. Linda Nordling, "Scientists are falling victim to deepfake AI video scams — here’s how to fight back," Nature, published August 7, 2024. Accessed November 12, 2024.
  5. Isabelle Bousquette, "Deepfakes Are Coming for the Financial Sector," The Wall Street Journal, published April 3, 2024. Accessed January 14, 2025.
  6. Satish Lalchand, Val Srinivas, Brendan Maggiore, and Joshua Henderson, "Generative AI is expected to magnify the risk of deepfakes and other fraud in banking," Deloitte, published May 29, 2024. Accessed January 14, 2025.
  7. Michael Kan, "FBI: Scammers are interviewing for remote jobs using deepfake tech," Mashable, published July 1, 2022. Accessed January 14, 2025.
  8. Jon Healey, "Real-time deepfakes are a dangerous new threat. How to protect yourself," Los Angeles Times, published May 11, 2023. Accessed January 14, 2025.
  9. Jeffrey Burt, "AI Now a Staple in Phishing Kits Sold to Hackers," MSSP Alert, published October 8, 2024. Accessed January 14, 2025.
  10. Reece Rogers, "Real-Time Video Deepfake Scams Are Here. This Tool Attempts to Zap Them," WIRED, published October 15, 2024. Accessed January 14, 2025.
  11. Better Business Bureau, "BBB Tip: How to spot a deepfake and avoid scams," published June 14, 2022. Accessed January 14, 2025.
  12. Vermont Secretary of State, "A.I. Deepfakes and Scams," accessed January 14, 2025.
  13. National Council on Aging, "Understanding Deepfakes: What Older Adults Need to Know," published October 30, 2024. Accessed January 14, 2025.
  14. Internet Crime Complaint Center, Complaint Form, FBI, accessed January 14, 2025.