When Client Data Gets Pasted Into AI Tools

The situation

A junior accountant is preparing a tax summary for a client. To speed things up, they copy client names, tax references, payroll figures, company financials, and notes from internal working papers into ChatGPT, Claude, Gemini, or Copilot.

They are not trying to do anything wrong. They are trying to be productive.

But the moment they paste that information into an unmanaged AI tool, the firm may lose control over where that data is stored, who can review it, and whether it may be used to improve AI systems.

This is already a known business risk. Bloomberg reported that Samsung banned staff use of generative AI tools after discovering employees had uploaded sensitive source code and internal data to ChatGPT.

Google’s own Gemini privacy guidance warns users not to enter confidential information they would not want a human reviewer to see or Google to use to improve its services. OpenAI similarly states that user-facing data controls determine whether conversations are used to improve its models. The practical consequence is the same: firms need clear rules about which AI tools staff can use, and under what settings.

Why this matters for accounting firms

Accounting firms handle unusually sensitive information:

* Client names and addresses
* UTRs, NI numbers, EINs, SSNs, and tax references
* Payroll records
* Bank details
* Management accounts
* Tax planning notes
* Director and shareholder information
* Confidential client correspondence

A staff member using AI casually could expose more sensitive client data in 30 seconds than they would ever send in a normal email.

The training lesson

Staff need to understand:

* Which AI tools are approved
* What data must never be pasted into AI tools
* How to anonymise prompts
* When to use enterprise-protected tools such as Microsoft 365 Copilot
* When to ask a manager before using AI
* How to handle confidential client information safely
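Anonymising prompts, as mentioned above, can be partly automated. The sketch below is a minimal, illustrative Python example that replaces identifier-like strings with labelled placeholders before text is pasted into an AI tool. The patterns and labels are assumptions for illustration only; a real firm policy would need a vetted, much broader set of rules, and automated redaction should supplement, not replace, staff judgement.

```python
import re

# Hypothetical patterns for a few common identifiers (illustrative only).
# NI number: two letters, six digits, one letter (certain letters excluded).
# UTR: a 10-digit HMRC Unique Taxpayer Reference.
PATTERNS = {
    "NI_NUMBER": re.compile(
        r"\b[A-CEGHJ-PR-TW-Z]{2}\s?\d{2}\s?\d{2}\s?\d{2}\s?[A-D]\b", re.I
    ),
    "UTR": re.compile(r"\b\d{10}\b"),
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def anonymise(text: str) -> str:
    """Replace identifier-like strings with labelled placeholders."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text
```

For example, `anonymise("NI AB123456C, UTR 1234567890, jane@firm.co.uk")` would strip all three identifiers before the prompt leaves the firm. Pattern-based redaction will always miss some context (a client's name in free text, for instance), which is why approved tools and manager sign-off remain part of the training lesson above.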

Microsoft says Microsoft 365 Copilot data is not used to train foundation models and is processed under enterprise data protections. But those protections do not automatically extend to every AI tool, or to the personal accounts staff might use.

Ready to do the same for your firm?

Setup takes minutes. Your whole team can complete training in under two hours.
