Deepfake Video Impersonation
Example
The situation
A finance employee joins what appears to be a video call with senior colleagues. The people on the call look and sound like real executives. They instruct the employee to make a series of transfers.
The employee follows the instructions because the video call appears legitimate.
This is no longer theoretical. In 2024, a Hong Kong finance worker was reportedly tricked into transferring about $25 million after fraudsters used deepfake technology to impersonate the company’s chief financial officer and other colleagues on a video call.
Deepfakes are AI-generated synthetic media that can depict someone saying or doing something they never said or did. Proofpoint describes deepfakes as synthetic video, audio, or other media used to convincingly impersonate people, and notes that modern deepfake tactics include voice cloning and synthetic video calls.
Why this matters for accounting firms
Accounting firms are built on trust. Staff routinely act on instructions from:
* Partners
* Clients
* Finance directors
* Payroll contacts
* Bank representatives
* Internal managers
That trust is now exploitable.
A deepfake video call could be used to request:
* Bank detail changes
* Payroll rerouting
* Urgent payments
* Client file access
* Password resets
* Access to cloud accounting platforms
* Changes to Companies House or HMRC credentials
The training lesson
The key point is simple:
Seeing someone’s face on a video call is no longer enough.
Staff should be trained to verify high-risk requests through a second channel, especially when the request involves money, credentials, sensitive data, or changes to client records.
Examples:
* Call the known phone number already stored in the CRM
* Confirm through a separate email thread
* Require two-person approval for bank changes
* Never use contact details provided inside the suspicious request
* Escalate unusual urgency or secrecy
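The verification steps above can be sketched as a simple policy check. This is a minimal illustration, not a real firm's control system: the category names, fields, and thresholds are hypothetical, and a real implementation would live inside the firm's workflow or approval tooling.

```python
# Minimal sketch of a high-risk-request policy check (hypothetical names).
# Idea: a request may proceed only after out-of-band verification, and bank
# detail changes additionally require two-person approval.
from dataclasses import dataclass, field

# Hypothetical categories mirroring the high-risk requests listed above.
HIGH_RISK_CATEGORIES = {
    "bank_detail_change", "payroll_reroute", "urgent_payment",
    "credential_reset", "client_file_access",
}

@dataclass
class Request:
    category: str                                 # e.g. "bank_detail_change"
    verified_via_second_channel: bool = False     # call-back on a CRM-stored number, separate email thread
    approvals: set = field(default_factory=set)   # staff IDs who approved

def may_proceed(req: Request) -> bool:
    """Return True only once the policy conditions for this request are met."""
    if req.category not in HIGH_RISK_CATEGORIES:
        return True   # routine request: normal process applies
    if not req.verified_via_second_channel:
        return False  # seeing a face on a video call is not verification
    if req.category == "bank_detail_change" and len(req.approvals) < 2:
        return False  # two-person approval required for bank changes
    return True
```

For example, `may_proceed(Request("bank_detail_change"))` is `False` until the request has been verified through a second channel and approved by two people. The point of encoding the rule is that it fails closed: no single person, and no single communication channel, can authorize a high-risk change on its own.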