Crypto fraud through deepfakes – how artificial intelligence creates new models of deception

Cryptocurrency fraud has reached a new dimension in recent years. While classic phishing emails and poorly designed fake websites are becoming increasingly ineffective, perpetrators are now turning to artificial intelligence and deepfake technology. Highly realistic videos, deceptively real voices, and even seemingly authentic live conversations are deliberately used to build trust and trigger financial transactions.
This development marks a new level of escalation in digital economic crime.

Deepfakes as a tool in modern investment fraud

Deepfakes are AI-generated audio and video content that can imitate real people almost perfectly. Facial expressions, speech, mannerisms, and reactions now appear so authentic that even attentive viewers can hardly detect any manipulation. Scammers are increasingly using this technology for fake investment offers in the crypto sector.

Victims see or hear supposedly well-known personalities, alleged company representatives, or purported government officials promoting "safe" crypto investments or warning of urgent security problems. The combination of image and sound significantly increases credibility—and lowers the barrier to following the instructions.

Billions in losses due to AI-powered crypto fraud

International analyses illustrate the scale of this development. In 2024 alone, known losses from cryptocurrency fraud worldwide totaled several billion US dollars. A growing proportion of these losses can be attributed to fraud schemes that specifically utilize deepfakes.

Law enforcement agencies report professionally organized criminal groups, particularly in Asia, that systematically use AI-powered deception. At the same time, case numbers are also rising significantly in Europe and North America. A striking feature is that the damage per case is increasing, as victims are no longer deceived with a quick one-off scam but manipulated over longer periods.

Simulation of authority and psychological pressure

A key feature of modern deepfake fraud schemes is the deliberate staging of authority. Perpetrators pose as well-known businesspeople, stock market experts, or government investigators. In some cases, they conduct video or voice calls lasting several days to build trust and exert psychological pressure.

This form of digital manipulation combines technical deception with social coercion. Through professional interviewing techniques, convincing technical language, fabricated backgrounds, and seemingly official documents, a situation is created in which even critical individuals make decisions they would never make under normal circumstances.

Deepfake support and remote access as a particularly dangerous tactic

Fraudulent support calls purporting to come from cryptocurrency exchanges or wallet providers are currently particularly common. The AI-generated voices sound credible, respond appropriately to the situation, and use industry-specific terminology. Victims are warned of alleged security incidents and urged to take immediate action.

Later, the perpetrators often request remote access to computers or smartphones. Under the guise of a technical review, they gain access to wallets, banking apps, or two-factor authentication codes. The actual transfer of assets often occurs within minutes.

Particularly affected: older users and a high number of unreported cases

Statistical analyses show that older users are disproportionately affected. Many have little experience with AI-based deception or deepfake technology and rely on the seemingly official nature of the initial contact.

Furthermore, there is a high number of unreported cases: out of shame or uncertainty, many victims do not report the fraud or only do so very late. The actual financial damage is therefore likely to be significantly higher than the figures known so far.

Legal and forensic classification

Legally, deepfake-based crypto fraud regularly qualifies as fraud under Section 263 of the German Criminal Code (StGB). This applies in particular when financial losses are caused through deception regarding supposed investments or security measures. If the transfer of assets occurs using digital systems – for example, through manipulated wallet access or online transactions – computer fraud under Section 263a StGB may also come into consideration.

If the identity of real persons or institutions is misused using deepfakes, criminal offenses related to data and identity theft (Sections 202a et seq., 269 of the German Criminal Code) may also apply. In the case of organized groups of perpetrators with a division of labor and the intent to repeat the offense, there is often also a commercial approach, which must be taken into account as an aggravating factor.

While the technical complexity of these fraud schemes complicates investigations, it by no means precludes legal and forensic analysis. In particular, payment flows, wallet transactions, communication histories, server and access data, and remote access logs offer reliable starting points for reconstructing the sequence of events and establishing responsibility.
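As a minimal illustration of what reconstructing payment flows can look like in practice, the following sketch builds a directed transfer graph from transaction records and traces all outflows from a compromised wallet. The addresses, amounts, and record format are invented for the example; real analyses work on exported blockchain data and are considerably more involved.

```python
from collections import defaultdict, deque

# Hypothetical, simplified transfer records, as they might be exported
# from a blockchain explorer. All addresses and amounts are invented.
transactions = [
    {"from": "victim_wallet", "to": "mule_1", "amount": 2.5},
    {"from": "victim_wallet", "to": "mule_2", "amount": 1.0},
    {"from": "mule_1", "to": "exchange_deposit", "amount": 2.4},
    {"from": "mule_2", "to": "exchange_deposit", "amount": 0.9},
]

def trace_outflows(txs, source):
    """Breadth-first trace of all addresses reachable from `source`,
    returning the total amount that flowed into each address."""
    graph = defaultdict(list)
    for tx in txs:
        graph[tx["from"]].append((tx["to"], tx["amount"]))

    totals = defaultdict(float)
    seen, queue = {source}, deque([source])
    while queue:
        addr = queue.popleft()
        for nxt, amount in graph[addr]:
            totals[nxt] += amount
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return dict(totals)

print(trace_outflows(transactions, "victim_wallet"))
```

Even this toy version shows why such traces are useful as evidence: the two intermediary ("mule") addresses converge on a single exchange deposit address, which in a real case would be the point where identification via the exchange's know-your-customer records becomes possible.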

Why classic precautionary measures are no longer enough

The rapid development of AI technologies clearly demonstrates that purely visual or auditory plausibility checks are no longer sufficient. Deepfakes circumvent traditional warning signals and no longer rely on mass fraud, but rather on targeted, intensive manipulation of individuals – resulting in correspondingly high financial losses.

Classification and support through financial forensics

As specialists in financial forensics, we support lawyers, companies, and private individuals in the structured investigation of financial fraud, cryptocurrency fraud, and asset concealment. Our focus is on the forensic analysis of payment flows, the evaluation of digital traces, and the verifiable reconstruction of complex cases.

The results are presented in a plausible, legally sound and discreet manner, serving as a reliable basis for legal action, internal assessments or out-of-court settlements.

FAQs – Frequently Asked Questions about Crypto Fraud through Deepfakes and AI

What is crypto fraud using deepfakes?
Cryptocurrency fraud using deepfakes refers to fraudulent schemes that employ artificial intelligence to generate deceptively realistic videos, voices, or conversations. The goal is to build trust and persuade victims to make financial transactions with cryptocurrencies.

How are deepfakes used in investment fraud?
Deepfakes are used to realistically imitate well-known personalities, company representatives, or alleged authorities. The perpetrators advertise supposedly safe crypto investments or warn of fabricated security problems.

Why are these schemes so convincing?
Because image and sound are combined, creating a high degree of credibility. Classic warning signals are often completely absent, meaning even attentive people can be deceived.

How high is the financial damage?
The damage is often very high, as victims are deliberately manipulated over extended periods. Worldwide, known losses amount to several billion US dollars per year.

How do perpetrators build psychological pressure?
Perpetrators simulate authority, for example as entrepreneurs, stock market experts, or investigators, and sometimes conduct conversations lasting several days. This creates social and emotional pressure that makes rational decisions difficult.

How do fake support calls typically begin?
Often, people claiming to be support staff from cryptocurrency exchanges or wallet providers contact victims. Victims are warned of alleged security incidents and urged to take immediate action.

Why is remote access so dangerous?
Remote access gives perpetrators direct access to wallets, banking apps, or security codes. This allows them to transfer assets within a very short time.

Who is particularly affected?
Statistics show that older users are particularly affected, as they have less experience with AI-powered deception and are more likely to trust supposedly official instructions.

How is deepfake-based crypto fraud classified under German law?
Deepfake-based crypto fraud regularly fulfills the elements of fraud (Section 263 StGB) and computer fraud (Section 263a StGB), often in conjunction with identity and data misuse (Sections 202a et seq., 269 StGB); commercial commission of the offense is taken into account in sentencing (Section 46(2) StGB).

Can such cases be reconstructed forensically?
In many cases, yes. Payment flows, wallet movements, communication histories, and technical access data offer important starting points for a forensic reconstruction of the crime.

David Lüdtke
David Lüdtke is the managing director of Krypto Investigation GmbH and a certified Crystal Expert (CECF, CEEI, CEUI) specializing in blockchain and financial forensics.

Questions on this topic?

Contact us for a personal consultation.