The Ultimate Guide to Impersonation Fraud: Trends, Tactics, and Mitigation


Fake bank representatives, government officials, police officers, or even relatives. Who hasn’t heard of impersonation fraud, one of the most popular and effective tactics of scammers?

In impersonation fraud, fraudsters deceive their victims by pretending to be trustworthy individuals, causing deep financial, reputational, and psychological damage.

The surge of impersonation fraud is best illustrated by data. In the US, imposter fraud* was a top fraud category in 2023, with more than 856,000 reported cases and a significant increase in business and government impersonators, according to the Federal Trade Commission (FTC). Reported losses to business and government impersonation scams amounted to $1.1 billion, more than three times what consumers reported in 2020.

Why is impersonation fraud so effective?

Impersonation fraud is a typical example of social engineering, and as such exploits human psychology. 

  1. Mimicking trusted institutions

Fraudsters very often impersonate employees of traditionally trusted institutions—typically bank agents or bank security personnel, investment advisors, police officers, or government officials. These figures carry strong authority with the general population, so people are more likely to follow their instructions. This is also why some impersonators pose as a company's top executives and target its employees.

  2. Using fake documents

To establish trust, scammers employ other tactics as well. They often present victims with documents that appear to prove their credibility—certificates, references, and other fake evidence. A spoofed phone number or a compromised business email account further lowers the victim's guard.

  3. Providing truthful information – initially

Criminals often provide the victim with accurate information—at first, that is. This tactic, known as pretexting, is an integral part of social engineering. Indeed, in the era of the internet, it is not difficult to simply Google information that will seemingly confirm the impersonator's identity to the victim. According to a report by Omdia, gathering such information takes only about 100 minutes of internet searching.

  4. Creating time pressure

Once the scammer has established authority and trust with the victim, they have effectively succeeded. The demand (to send money or share sensitive data) is usually delivered with a sense of urgency, giving the victim no time to question the scammer's request.

The combination of these factors makes the impersonation strategies of fraudsters extremely effective.

The worrying deepfake factor

Impersonation scams are responsible for serious financial damage to both consumers and companies. In 2023, government impersonators cost US consumers $618 million, with a median loss of $1,400. Scammers posing as US Customs and Border Protection caused the highest average damage at $4,200. Meanwhile in the UK, the average loss per victim of an impersonation scam amounted to £7,448.

In the future, the use of generative artificial intelligence, particularly deepfakes, will only add fuel to the fire. A recent example is the British engineering giant Arup, which fell victim to a sophisticated CFO impersonation scheme. Fraudsters used deepfake technology to pose as the company's chief financial officer in a video conference call, tricking a finance worker into paying out $25 million.

Additionally, deepfake videos imitating high-profile individuals and celebrities increasingly target the general public. Examples include frequent deepfakes of Elon Musk used for investment/cryptocurrency scams.

The impact of impersonation scams on banks

In addition to financial loss, impersonation scams cause significant reputational damage. They undermine consumer trust in a secure digital environment and in the institutions that fraudsters mimic.

The situation is particularly challenging for banks. Firstly, the impersonation of bank representatives is one of the fraudsters’ go-to strategies, posing a serious reputational threat to banks. Secondly, according to 2023 FTC data, bank transfers accounted for about 40% of reported losses to US government and business impersonators.

The upcoming liability shift

Finally, banks are facing yet another major change. Due to the growth and damaging impact of impersonation scams, this type of fraud has come under the spotlight of regulators, who are increasingly requiring banks to compensate victims.

In the UK, for example, the compensation rate for victims of police/bank staff impersonation scams is 78%, the highest of any scam category. In the proposed PSD3, the EU is moving in a similar direction, requiring mandatory compensation for victims of bank impersonation scams. 

The FTC has also recently stepped into the fight against impersonation scams. The new rule on government and business impersonators gives the FTC stronger tools to combat and deter these scammers, enabling the agency to file federal court cases seeking to return money to injured consumers and impose civil penalties on violators.

While compensating victims is probably the only way to protect reputations—especially when it comes to fake bankers—it presents a big challenge for banks. To avoid high compensation costs, they will need to improve their fraud prevention and detection mechanisms. A large proportion of impersonation scams fall into the category of authorized push payment (APP) fraud where legitimate customers make a seemingly legitimate payment—but under false pretenses.

Strategies to combat impersonation fraud

As impersonation scams use a wide range of fraudulent methods, they require a comprehensive approach to detection and prevention. Customer education and awareness campaigns are essential—financial institutions and other organizations need to teach their clients how to differentiate between legitimate and fraudulent requests. Similarly, they should inform their clients how to verify any unusual requests from people who contact them.

Another important aspect of fraud prevention is the use of advanced detection mechanisms and technologies. One of the most proven in this regard is behavioral intelligence. Its advantages lie mainly in its ability to detect fraud in real time across all digital channels based on a variety of signals.

How to protect customers while disrupting impersonators

ThreatMark’s Behavioral Intelligence Platform, for example, combines behavioral data with other inputs, allowing it to assess whether:

  • the user is legitimate, 
  • the user is behaving in their usual way,
  • the transaction has certain risk factors (e.g., a new beneficiary, an unusually high amount, an instant payment request),
  • financial malware is present on the device,
  • remote access tools or trojans are being used on the device,
  • the user is talking to someone on the phone during the transaction. 

Individually, these factors might not mean much, but when considered together and in context, they can detect a scam with unprecedented accuracy.
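The idea of combining weak signals in context can be sketched as a simple weighted risk score. This is a hypothetical illustration only—the signal names, weights, and threshold below are invented for the example and do not reflect ThreatMark's actual model, which uses far richer behavioral data:

```python
from dataclasses import dataclass

@dataclass
class SessionSignals:
    """Per-transaction signals; names and semantics are illustrative."""
    behavior_matches_profile: bool   # typing/navigation matches the user's history
    new_beneficiary: bool            # payment goes to a never-before-seen account
    unusually_high_amount: bool
    instant_payment: bool
    malware_detected: bool
    remote_access_tool_active: bool
    active_phone_call: bool          # user is on a call while transacting

def scam_risk_score(s: SessionSignals) -> float:
    """Combine weak signals into one score in [0, 1]; weights are made up."""
    score = 0.0
    if not s.behavior_matches_profile:
        score += 0.25
    if s.new_beneficiary:
        score += 0.15
    if s.unusually_high_amount:
        score += 0.15
    if s.instant_payment:
        score += 0.10
    if s.malware_detected:
        score += 0.40
    if s.remote_access_tool_active:
        score += 0.35
    if s.active_phone_call:
        score += 0.20
    return min(score, 1.0)

# Each signal alone is weak, but a typical impersonation-scam session
# trips several at once and pushes the score to the cap:
session = SessionSignals(
    behavior_matches_profile=False,
    new_beneficiary=True,
    unusually_high_amount=True,
    instant_payment=True,
    malware_detected=False,
    remote_access_tool_active=True,
    active_phone_call=True,
)
print(scam_risk_score(session))  # prints 1.0 (capped)
```

A real system would learn these weights from labeled fraud data and act on the score in real time (step-up authentication, transaction hold, or a warning to the customer) rather than apply hand-tuned rules.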

Another key benefit of the Behavioral Intelligence Platform is its comprehensive approach. It not only prevents fraudulent transactions but also detects attempts to mimic legitimate banking platforms and identifies attackers’ infrastructure, devices, tools, locations, payment methods, and vendors. This leads to disabling the entire fraud network, protecting bank customers at scale.

Preparing for the future

Impersonation scams pose a major threat to the future of the digital environment, which will only be exacerbated by the development of AI-generated deepfakes. To keep up with fraudsters, avoid financial and reputational losses, and meet the demands of regulators, banks must adopt advanced detection and prevention technologies. Investing in these tools will help secure the digital landscape and restore consumer trust.

* The Federal Trade Commission defines imposter scams as fraud in which someone pretends to be a trusted person to get consumers to send money or share personal information. Examples include scammers posing as a government employee/agency, a company, a friend, a relative, a romantic interest, etc.


