
$38 Billion Stolen: Your Selfie Did It


About 92 million selfies are taken daily by over 5.5 billion social media users worldwide, meaning two in three people on Earth use social media. The average American uses their smartphone to take a photo six times a day.

Moreover, Americans lost $38 billion to identity fraud and scams in 2025 alone, while AI-powered deepfake fraud is now responsible for 11% of all fraudulent activity globally.

Most people never think twice when posting a selfie of a morning vibe, a birthday party, or a beach trip. But that image, once shared on social media, is accessible to billions of people, including criminals who can create a digital copy of that selfie to open a bank account, apply for a loan, and vanish before the first statement arrives, ruining your name, your credit score, and your financial future.

AI did not create this problem, but your impulse to share your photo with your circle of friends did. AI simply turned that impulse into a weapon for criminals.

This is the story of how a single publicly visible photograph of your face has become the raw material for the most sophisticated identity fraud, what it costs you when it works, and how you can protect yourself from it.

How A Selfie Is Used for AI Fraud

In 2025, approximately 2.1 trillion photos were taken globally, roughly 66,600 every single second. Of those 2.1 trillion photos, 14 billion images were shared daily on social media, with WhatsApp leading at 6.9 billion, followed by Snapchat at 3.8 billion, Facebook at 2.1 billion, and Instagram at 1.3 billion.

Every one of those users is, knowingly or not, contributing to the largest publicly accessible facial database in human history, and fraudsters harvest it to commit identity fraud.

A fraudster identifies your clear, well-lit public social media profile, then uses open-source generative AI tools to turn it into a complete photorealistic synthetic video, with eye movement and facial animation, in minutes.

That video is then fed through a virtual camera driver, a piece of software that intercepts the video pipeline of your phone or computer and substitutes the AI-generated face for a live camera feed. When a bank’s KYC system asks for selfie verification, it receives your face, animated, realistic, and undetectable by most passive liveness checks. The account opens. The loan application goes in. The money is withdrawn. Then the account goes silent. You find out eighteen months later when a debt collector calls about a balance you never owed.

The more data that is available about a target, including photos and audio clips posted online, the easier it is to create a convincing deepfake of that person. Most people hand over that data voluntarily, every day, for free.

The Scale of the Damage: US and Global Losses

Americans lost $38 billion to identity fraud and scams in 2025 alone, while AI-powered deepfake fraud is now responsible for 11% of all fraudulent activity globally.

In 2026, deepfake fraud damages surged significantly, causing over $2.19 billion in financial losses globally. The United States is the most targeted country, accounting for roughly $712 million in losses, with Malaysia, Hong Kong, and the United Kingdom rounding out the most targeted nations.

AI scams, including deepfakes, are expected to surge in 2026, with deepfakes accounting for 11% of all global fraudulent activity, according to Sumsub's 2026 Fraud Report.

According to the Javelin 2026 Identity Fraud Study, identity fraud losses reached $27.3 billion in 2025, affecting 18 million victims. Meanwhile, the FBI IC3 2025 Annual Report documented $20.877 billion in total cybercrime losses and 22,364 AI-facilitated fraud complaints for the year.


The Safer Way to Post

Security experts and cybersecurity organizations recommend a specific set of practical measures. None of them requires you to disappear from social media. All of them meaningfully reduce your exposure.

1. Lock your accounts to friends only immediately. Adjust your social media privacy settings so that only trusted people can see what you share, and restrict who can view your photos, videos, and other sensitive data. The less publicly available material there is, the fewer resources potential deepfake creators have.

2. Watermark your images before posting. Watermarking images or videos makes them harder to use cleanly in deepfake generation. Free tools can be used to watermark images, making them less useful for AI training.

3. Stop accepting unknown followers or friend requests. Make sure that you trust anyone who requests to follow or friend you. An unknown account following you and viewing your photos is functionally a data harvesting operation.

4. Avoid high-resolution, face-forward selfies as public posts. Be cautious about sharing high-quality photos and videos of yourself, your friends, or family members online. The cleaner and higher-resolution a facial image is, the better the raw material it makes for deepfake generation.

5. Never upload photos to unknown “fun” AI apps. Many viral ‘age me’, ‘AI art’, or ‘cartoon me’ apps collect and store uploaded photos. If the app mentions using your image for research, training, or improvement, it is a red flag.

6. Check if your photos are already in AI training datasets. Websites like Have I Been Trained allow you to verify whether your photos appear in known datasets used to train AI models.

7. Set up an IRS Identity Protection PIN. The IP PIN is a six-digit number issued annually that must be included in any tax return filed under your Social Security number. Without it, the IRS rejects the return, preventing a fraudster from filing taxes in your name and claiming your refund.

8. Create a family safe word. A safe word is a pre-agreed code word or phrase that only you and your trusted group know. It creates an extra layer of security during a person’s identity verification, particularly against voice-cloned deepfake calls.
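Watermarking (step 2 above) does not require a third-party service; it can be done locally in a few lines of code. Below is a minimal sketch using the Pillow imaging library (an assumption: Pillow is installed via `pip install Pillow`; the file names and the `watermark` function are illustrative, not a specific tool named in this article):

```python
from PIL import Image, ImageDraw, ImageFont

def watermark(src_path: str, dst_path: str, text: str = "PRIVATE") -> None:
    """Tile a semi-transparent text watermark across the whole image."""
    base = Image.open(src_path).convert("RGBA")
    overlay = Image.new("RGBA", base.size, (0, 0, 0, 0))
    draw = ImageDraw.Draw(overlay)
    font = ImageFont.load_default()
    # Repeat the mark on a grid so cropping cannot remove it cleanly.
    step = max(1, max(base.width, base.height) // 4)
    for x in range(0, base.width, step):
        for y in range(0, base.height, step):
            draw.text((x, y), text, fill=(255, 255, 255, 96), font=font)
    Image.alpha_composite(base, overlay).convert("RGB").save(dst_path)

# Demo with a stand-in image (replace with an actual photo path).
Image.new("RGB", (640, 480), (180, 160, 140)).save("selfie.jpg")
watermark("selfie.jpg", "selfie_marked.jpg")
```

Tiling the mark across the frame, rather than stamping one corner, matters: a single corner watermark can simply be cropped out before the image is fed to a deepfake generator.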


What the Law Says, Where It Falls Short

The legal protection against these frauds exists, but it has significant gaps that most victims discover too late. The legal framework against financial identity fraud is complex and primarily designed for remedial measures rather than immediate justice.

To serve a lawsuit and collect damages in court, you must be able to identify, locate, and serve a specific person or entity. Many cybercriminals operate anonymously or in foreign jurisdictions, making it nearly impossible to identify them to initiate a lawsuit.

Victims can pursue civil claims only against known entities, such as a bank that negligently failed to secure an account, and even then they can wait months for interim injunctions and years for final judgments.

The Fair Credit Reporting Act and the Electronic Fund Transfer Act provide mechanisms to remove fraudulent transactions and repair credit, but they do not prevent the initial fraud. Bills like the Stop Identity Fraud and Identity Theft Act of 2026, introduced in early 2026 to address these gaps, are still in the early stages of the legislative process.

The legal framework exists but is incomplete, focusing heavily on fund recovery rather than catching the criminals.

Recovery Steps

Repairing your credit after identity fraud can take weeks or months, but these steps should be taken immediately:

  1. Report immediately at IdentityTheft.gov.
  2. File a police report.
  3. Freeze your credit at all three bureaus (Equifax, Experian, and TransUnion) immediately.
  4. Block fraudulent accounts.
  5. Set up an IRS Identity Protection PIN.

According to the 2025 Consumer Impact Report, over 20 percent of victims reported losses exceeding $100,000, and over 10 percent lost at least $1 million, while 36.9 percent reported losses exceeding $10,000 and 19.6 percent reported losses under $500 in 2025. Recovery is possible, but it is neither fast nor easy.
