Governance | REF: GOV_002 | READ TIME: 4 MIN

There is no such thing as "Free AI."

If you aren't paying for the product, you are the training data. Why £20/month is the cheapest insurance you will ever buy.

Alex Harvey

Lead Technologist

Executive Summary

  • Free AI tools grant themselves the right to train on the data you input.
  • Paying for "Pro" does not automatically secure your data.
  • The Fix: Provision Enterprise seats AND configure the settings.

I conduct a lot of audits. There is a conversation I have almost every week. It usually goes like this:

Me: "Do you allow your staff to use Generative AI?"

CEO: "Absolutely not. We blocked ChatGPT on the firewall. It's too risky."

Me: "Okay. Let's ask your Marketing Manager."

Marketing Manager: "Oh, I use it every day. I just do it on my phone using 4G. It writes all our newsletters."

This is Shadow AI.

Your staff are not malicious. They are efficient. They have found a tool that does their job 10x faster. If you block it, they won't stop using it; they will just stop telling you about it.

The "Free Tier" Trap

Most employees use the free version of ChatGPT, Claude, or Gemini. It costs £0. It requires no credit card. It seems harmless.

But read the Terms of Service.

When you use the free tier, you are explicitly granting the provider the right to use your inputs to train their future models.

"Free Tier = Public Domain."

The Data Sovereignty Equation

If your HR manager pastes a disciplinary letter into free ChatGPT to "polish the tone," that personal data is now training data. If your developer pastes proprietary code to "debug it," that code is now in the wild.

You are leaking Intellectual Property in exchange for convenience.

The Configuration Trap

The solution is not to ban AI. The solution is to pay for it. But paying is only Step 1.

Many business leaders assume that upgrading to a "Pro" or "Team" plan automatically secures their data. This is false.

Many AI vendors default their settings to "Improve the Model" (Training: ON) even for paid users. You are paying £20 a month, but you are still leaking data until you manually intervene.

You must go into the Admin Console. You must find the Data Controls. You must toggle "Model Training" to OFF.

If you do not configure the tool, you are just paying a premium for the same risk.
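To make "verify, don't assume" concrete, here is a minimal sketch of what that check could look like. It assumes, purely for illustration, that you can export your workspace's data-control settings to a JSON file via your vendor's admin console; the file name and field names such as training_enabled are hypothetical, not any vendor's real API, so adapt them to whatever your tool actually exports.

```python
# Hypothetical sketch: flag paid seats where model training is still enabled.
# Assumes an exported JSON file of workspace settings shaped roughly like:
#   [{"email": "user@yourco.co.uk", "plan": "team", "training_enabled": true}, ...]
# Field names are illustrative; check your vendor's actual export format.
import json

def find_leaky_seats(path: str) -> list[dict]:
    """Return every seat that is still opted in to model training."""
    with open(path, encoding="utf-8") as f:
        seats = json.load(f)
    # Treat a missing setting as "training on" -- do not trust the defaults.
    return [seat for seat in seats if seat.get("training_enabled", True)]

if __name__ == "__main__":
    leaky = find_leaky_seats("workspace_data_controls.json")
    if leaky:
        print(f"{len(leaky)} paid seat(s) are still training the model:")
        for seat in leaky:
            print(f"  - {seat['email']} ({seat.get('plan', 'unknown')})")
    else:
        print("All seats have model training disabled.")
```

Note the deliberate design choice: if the setting is missing from the export, the seat is flagged anyway. That mirrors the rule of this whole section: assume the default is leaking until you have proof otherwise.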

The Enterprise Shield

For the price of a takeaway lunch (£20), you convert a liability into an asset.

  • Zero Training: Your data stays yours (if configured).
  • SSO (Single Sign-On): Revoke access instantly when staff leave.
  • Copyright Shield: Legal indemnity for generated output.

The Governance Strategy

You cannot fight efficiency. If a tool saves your staff 5 hours a week, they will use it. Your job as a leader is to ensure they use it safely.

1. Audit the Usage: Find out who is using what. Don't punish them; ask them *why*. (A simple log-check sketch follows this list.)

2. Provision the Tools: Buy the Enterprise licenses. £200 a month for 10 seats is negligible compared to a GDPR fine.

3. Configure the Settings: Do not trust the defaults. Verify that training is disabled.
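For Step 1, if your gateway or DNS filter can export logs, even a crude count of who is reaching the big consumer AI services gives you the conversation starter you need. The sketch below is illustrative only: the log format, file name, and domain list are assumptions to adapt, not a definitive detection rule.

```python
# Illustrative sketch for Step 1 (Audit the Usage): count hits to well-known
# consumer AI services in a firewall/proxy/DNS log. The assumed log format is
# a source followed by a hostname on each line, e.g.:
#   10.0.0.14 chatgpt.com
# Adjust the parsing to match whatever your gateway actually exports.
from collections import Counter

AI_DOMAINS = (
    "chatgpt.com",
    "chat.openai.com",
    "claude.ai",
    "gemini.google.com",
    "copilot.microsoft.com",
)

def shadow_ai_hits(log_path: str) -> Counter:
    """Count requests per (source, domain) pair for known AI services."""
    hits: Counter = Counter()
    with open(log_path, encoding="utf-8") as log:
        for line in log:
            parts = line.split()
            if len(parts) < 2:
                continue
            source, host = parts[0], parts[1].lower()
            for domain in AI_DOMAINS:
                if host == domain or host.endswith("." + domain):
                    hits[(source, domain)] += 1
    return hits

if __name__ == "__main__":
    for (source, domain), count in shadow_ai_hits("gateway.log").most_common():
        print(f"{source} -> {domain}: {count} requests")
```

Remember the Marketing Manager on 4G: logs only show traffic that crosses your network, so treat the output as a starting point for honest conversations, not a complete picture of usage.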

Conclusion

AI is a superpower. But like any power tool, it needs safety guards.

Stop letting your staff play in the traffic. Build them a fence.

Do you know your exposure?

My Strategic Risk Audit identifies exactly where your staff are using Shadow AI and checks if your settings are actually secure.

Request an Audit