March 5, 2026
Privacy Advisory — OccuNX

Your Staff Is Already Using AI.
Here's How Client Data Leaks in 5 Minutes.

A risk briefing for law firms, CPA practices, and wealth management professionals.

Issued by OccuNX  ·  March 2026  ·  occunx.com
The Point

You have a data governance policy. You have a confidentiality agreement. You may even have cyber insurance.

None of that matters if your paralegal is summarizing a client's trust document in ChatGPT, your associate is drafting a demand letter with client names in Claude, or your bookkeeper is uploading a client's tax return to an AI tool to "check the numbers."

This is not hypothetical. It is happening in firms like yours right now. And in most cases, no one in leadership knows.

Risk Exposure

Under most AI platform terms of service, prompts and documents you submit may be used to train future models, reviewed by human moderators, or retained on vendor servers indefinitely.

Your clients did not consent to this. Your engagement letter almost certainly does not address it. Your malpractice carrier has not been asked about it.

Five Ways Client Data Exits Your Firm in Under 5 Minutes

These are real use patterns, not edge cases. If your staff has access to an AI tool and an internet connection, assume these scenarios exist in your firm today.

  1. Staff summarizes a client document
     An employee uploads a contract, trust, or tax return to ChatGPT or a similar tool to get a quick summary.
     Data exposed: client names, SSNs, financial figures, deal terms.

  2. AI drafting with real client names
     An associate pastes a client situation into an AI tool to draft a letter, memo, or brief without removing identifying information.
     Data exposed: identity, legal matter details, financial exposure.

  3. Using AI browser extensions
     Staff installs a "writing assistant" browser plugin that reads page content automatically — including your practice management software.
     Data exposed: entire client files, billing records, case notes.

  4. Uploading to AI-powered PDF tools
     An employee uses an online AI-powered "PDF summarizer" or "document analyzer" to process a client filing.
     Data exposed: tax documents, legal filings, private correspondence.

  5. Pasting data into AI for calculations
     Staff asks an AI to help with financial projections or estate math by pasting actual client figures.
     Data exposed: net worth, asset allocation, beneficiary details.
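Scenarios 2 and 5 are partly mitigable with a pre-paste redaction step. The sketch below is a minimal illustration, not a complete safeguard: the patterns and placeholder labels are assumptions for this example, and regex-based scrubbing catches only structured identifiers (SSNs, EINs, dollar amounts), never names, addresses, or matter details.

```python
import re

# Pattern-based scrubber for structured identifiers. Illustrative only:
# it catches SSNs, EINs, and dollar amounts, but NOT names, addresses,
# or free-text matter details -- those need human review under an
# approved workflow.
PATTERNS = [
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "[REDACTED-SSN]"),
    (re.compile(r"\b\d{2}-\d{7}\b"), "[REDACTED-EIN]"),
    (re.compile(r"\$\s?[\d,]+(?:\.\d{2})?"), "[REDACTED-AMOUNT]"),
]

def redact(text: str) -> str:
    """Replace structured identifiers before text leaves the firm."""
    for pattern, placeholder in PATTERNS:
        text = pattern.sub(placeholder, text)
    return text

sample = "Client SSN 123-45-6789; trust corpus $2,400,000.00."
print(redact(sample))
# -> Client SSN [REDACTED-SSN]; trust corpus [REDACTED-AMOUNT].
```

A scrubber like this belongs inside a written policy, not in place of one: it reduces accidental exposure but does not make an unvetted AI tool acceptable for client data.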
Why This Is a Specific Problem for Your Practice

General businesses face reputational risk when employee data leaks. Your firm faces a different category of problem.

For Law Firms

  • Privilege: Attorney-client privilege may be waived when confidential communications are transmitted to a third-party AI platform.
  • Bar Rules: State bar rules require reasonable measures to prevent unauthorized disclosure. Passive tolerance of AI tool usage may not qualify.
  • Engagement Letters: Most engagement letters predate generative AI. Your current language almost certainly does not address AI data transmission.

For CPA Firms

  • IRS Requirements: IRS Publication 4557 (Safeguarding Taxpayer Data) and Treasury Circular 230 require safeguards over taxpayer information. Cloud AI tools may not meet the definition of an appropriate safeguard.
  • Consent Gaps: Client consent to share tax information with AI platforms is rarely obtained and rarely contemplated in existing engagement terms.
  • State Law: Data breach notification obligations may be triggered even when no malicious actor is involved — unintended third-party disclosure counts.

For Wealth Managers

  • Regulatory Scrutiny: SEC and FINRA guidance increasingly scrutinizes how client data is stored, transmitted, and processed by third-party vendors.
  • Privacy Notice: Submitting client financial information to an AI platform may constitute a data transmission event requiring disclosure under your firm's privacy notice.
  • Fiduciary Exposure: AI-generated advice or analysis may create unanticipated liability, particularly when it influences decisions about client portfolios.
Risk Exposure

The AI platforms your staff uses are not your subprocessors. They have not signed a BAA, DPA, or data handling agreement with your firm.

When client data enters those platforms, it exits your control. Period.

What Your Current Policy Probably Doesn't Cover

Most professional services firms have some version of a data security or acceptable use policy. Almost none of them were written with generative AI in mind. Here is what is typically missing:

  • No AI Policy: An explicit prohibition — or approved workflow — for using AI tools with client data.
  • No Approved Tool List: A list of AI tools that have been vetted for your firm's compliance obligations.
  • No Review Process: A process for staff to ask whether a tool is acceptable before using it.
  • No Client Disclosure: Language in engagement letters addressing AI tool usage and data handling.
  • No Training: Staff training on what constitutes a "data disclosure" under applicable law or professional rules.

When there is no written policy, staff who paste client data into an AI tool are not breaking a rule. In practice, the absence of a policy becomes the policy.

What a Basic Response Looks Like

You do not need to ban AI. You need visibility and intentional structure. A reasonable starting point for any firm includes:

  1. Audit what AI tools are currently in use — including browser extensions and personal devices used for work.
  2. Map what client data categories exist in your firm and where they reside.
  3. Establish a written AI tool policy with explicit guidance on client data handling.
  4. Review and update engagement letters to address AI tool usage.
  5. For long-term risk reduction, evaluate local or on-premises AI tools that do not transmit data to third-party servers.
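Step 1 of the list above can be partially automated. The sketch below inventories Chromium-style browser extensions by reading their manifest files; the demo directory and extension name are fabricated for illustration, and real profile paths vary by OS and browser, so treat every path here as an assumption to verify.

```python
import json
import tempfile
from pathlib import Path

def list_extensions(extensions_root: Path) -> list[str]:
    """Collect extension names from Chromium-style manifest.json files
    found anywhere under extensions_root."""
    names = set()
    for manifest in extensions_root.rglob("manifest.json"):
        try:
            data = json.loads(manifest.read_text(encoding="utf-8"))
        except (OSError, json.JSONDecodeError):
            continue  # skip unreadable or malformed manifests
        name = data.get("name")
        if name:
            names.add(name)
    return sorted(names)

# Demo against a throwaway directory shaped like a browser profile.
# On a real workstation you would point this at the actual profile,
# e.g. ~/.config/google-chrome/Default/Extensions on Linux (an
# assumption -- the path differs on Windows, macOS, and other browsers).
demo = Path(tempfile.mkdtemp())
(demo / "abcd" / "1.0").mkdir(parents=True)
(demo / "abcd" / "1.0" / "manifest.json").write_text(
    json.dumps({"name": "AI Writing Helper", "version": "1.0"})
)
found = list_extensions(demo)
print(found)  # -> ['AI Writing Helper']
```

Real Chrome manifests sometimes store a localization placeholder (such as `__MSG_appName__`) in the `name` field, so output like this is a starting inventory for the audit, not a definitive tool list.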

The firms that get ahead of this are the ones that treat AI data exposure as a systems design problem, not an IT problem. The question is not whether to use AI. The question is how it moves through your practice — and where client data goes when it does.

About OccuNX

OccuNX is a privacy-first systems and risk consultancy. We work with small and mid-sized professional services firms — law firms, CPA practices, and wealth managers — to map data flows, identify vendor exposure, and reduce unnecessary digital risk. We are not an IT company, a software vendor, or a managed service provider. We do not promise perfect security. We help organizations understand how their data actually moves — and reduce the places it should not go.

Relevant Services for This Advisory

  • Business Privacy Audit — includes AI data exposure analysis and subprocessor mapping
  • AI Data Exposure Advisory — a standalone written assessment of AI-related data risk in your firm
  • Cloud Vendor Risk Tracker — maps which of your vendors are transmitting client data and where

To schedule a consultation or request an AI data exposure assessment: occunx.com

This document is for informational purposes only and does not constitute legal, compliance, or professional advice. Consult qualified counsel for guidance specific to your jurisdiction and practice.

OccuNX  ·  occunx.com  ·  Privacy-First Systems & Risk Consultancy  ·  Not Legal Advice
