WhatsApp and ChatGPT
Convenient for Staff. Invisible to You

Do you know where your customers' personal data or confidential business data is right now?

You may not be able to answer that question, because it’s not about where the data is meant to be, or where you think it is. It’s about where it actually is.

 

And the truth is that personal data is being shared every day through channels businesses don’t control, can’t monitor effectively, and quite possibly don’t even know about. 

 

Perhaps your sales team is sharing customer details on WhatsApp. Or your installers are sending site photos with identifiable information via personal messaging apps. Maybe your client services teams are pasting customer records into the free version of ChatGPT to help them draft an email faster. 

 

None of these people is being malicious or intends harm. These tools are efficient and they save time. There’s nothing wrong with them at all – quite the opposite. That’s why they’re used. And that’s why the problem is so hard to fix.

The Risks

The risks are real. Under UK GDPR, your organisation is the data controller. You’re responsible for personal data regardless of which app your employee chooses to send it through. When someone shares customer details on personal WhatsApp, that data sits on their personal device and is backed up to their personal cloud. It sits outside your policies, it’s invisible to your subject access request (SAR) process, and it’s completely beyond your ability to delete when the time comes.

 

Free AI tools carry similar risks. Many free-tier AI services reserve the right to use your inputs for model training. That customer complaint, client record or piece of personal data someone pasted in to draft a reply? It may now be part of a training dataset. You can’t retrieve it, you can’t delete it, and you quite possibly have no idea it happened.

Mitigating the Risks

A policy that nobody follows is just a document. It might give you something to point to if things go wrong, but it won’t prevent the breach and it won’t impress a regulator who asks what you actually did.

 

Writing a policy that says “don’t use unauthorised apps” is easy. Getting people to stop is something else entirely, and that needs a combination of practical steps:

 

Give people a better option. The most effective solution is simple. If your team has access to Microsoft Teams, a business WhatsApp account via the WhatsApp Business API, or an enterprise AI tool, the temptation to use personal alternatives drops significantly.
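
For illustration only, the short sketch below assumes Meta’s WhatsApp Business Cloud API, with placeholder credentials and a hypothetical helper function. It shows the practical difference: a message sent this way passes through an account the business controls, where it can be logged, retained and deleted under your own policies rather than sitting on a personal handset.

```python
# Minimal sketch, assuming the WhatsApp Business Cloud API.
# PHONE_NUMBER_ID and ACCESS_TOKEN are placeholders for a business account's
# own credentials; send_business_message is a hypothetical helper.
import requests

PHONE_NUMBER_ID = "YOUR_PHONE_NUMBER_ID"
ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"

def send_business_message(to_number: str, body: str) -> dict:
    """Send a text message through the business account, not a personal app."""
    url = f"https://graph.facebook.com/v18.0/{PHONE_NUMBER_ID}/messages"
    payload = {
        "messaging_product": "whatsapp",
        "to": to_number,
        "type": "text",
        "text": {"body": body},
    }
    headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}
    response = requests.post(url, json=payload, headers=headers, timeout=10)
    response.raise_for_status()
    # Because the message goes through the business account, it can be logged,
    # audited, retained and deleted under the organisation's own policies.
    return response.json()
```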

 

Make the training real. Alongside your generic training, show your people what actually happens in your own business when things go wrong. The subject access request you can’t fulfil because the conversation was on someone’s personal phone. The data breach when a device is lost. The customer complaint you can’t investigate because the evidence is in a private chat. Make the training relevant and they’ll listen.

 

Build in consequences. Your acceptable use policy needs teeth. For employees, misuse becomes a disciplinary matter. For contractors, distributors and installers, it goes in the contract – for example, by specifying approved communication channels in your data processing agreements.

 

Spot-check. Just ask your people: “How did you share that information with the customer?” If the answer is an unauthorised app, you know your policy isn’t working, and you have a trigger to act.

Accept the Residual Risk

The goal isn’t perfection, because you won’t be able to eliminate shadow processing entirely. People are resourceful and the tools are freely available. The goal is to reduce the “why shouldn’t I?” mindset, and to demonstrate that you’ve taken reasonable, proportionate steps to manage the risk.

 

That’s what the ICO would expect: an organisation that asked the right question, recognised it couldn’t answer it, and developed a process to address the gap. A clear policy. Practical alternatives. Tailored training. And a documented, evidenced approach to managing what you can’t fully control.


 

Victoria Tuffill | Data Compliant International

4th March 2026

 

Need help getting your messaging and AI use under control?

Data Compliant International helps businesses build practical, proportionate data protection frameworks that work in the real world — not just on paper. Get in touch at 01787 277742 or contact@datacompliant.co.uk.

 
