Shadow AI in Your Practice: How to Find Out What Tools Your Team Is Actually Using
It usually starts with a small shortcut. Someone pastes a client letter into an AI chatbot to polish the wording. Someone enables an AI feature inside a software tool because it saves them twenty minutes. Someone asks a bot to summarise a long document. None of these feel like decisions - they feel like sensible use of whatever is available. But once they become routine, they stop being tool choices and become a data problem: what information is leaving your practice, where it is going, and whether you could account for it if you had to.
This is what people mean by shadow AI - staff using AI tools without IT oversight, approval, or a consistent policy. The risk is not that your team is doing something malicious. It is that they are trying to work faster, and in doing so they may be putting client data, confidential files, or legally sensitive information into systems you have no visibility over. Internationally, surveys have found that a significant share of employees admit to sharing sensitive work information with AI tools without permission. That pattern is almost certainly present in Canterbury practices too, not because staff are careless, but because the tools are easy to access and genuinely useful.
What makes this harder than it sounds is that shadow AI does not always arrive as a new app someone signs up for. It shows up as an AI feature quietly switched on inside a platform you already pay for. It shows up as a browser extension. It shows up as a "copilot" that one of your staff enabled in their account settings without realising it could read files. By the time it becomes visible, it is already woven into how people work.
The other risk that gets overlooked is what happens to data over time. A tool might seem harmless the day someone uses it, but AI systems can retain inputs, use them to improve their models, or share data in ways that were not obvious when the feature was first enabled. Under the NZ Privacy Act 2020, your practice is responsible for how personal information is collected, used, disclosed, and stored - including when a third-party tool is doing the handling - and "I didn't know the tool was doing that" will not hold up as a defence if a breach occurs.
A shadow AI audit does not need to be a lengthy project or a signal to staff that they are in trouble. The aim is straightforward: find out what is actually being used, understand which workflows it touches, and make some clear decisions about what stays, what gets replaced, and what needs guardrails.
Start by looking at what you already have access to. Identity logs, browser activity on managed devices, and admin settings in your existing software will often tell you more than you expect. Follow that with a simple, non-judgemental question to staff - something like "what AI tools or features are saving you time right now?" - and you will get honest answers because you are treating it as a support question, not an investigation. Then map out where AI is touching real work: what kind of information goes in, and what comes out. Once you can see that clearly, it becomes much easier to classify the risk. Client information, financial records, and anything that would cause harm if disclosed belongs in a different category from public-facing marketing copy.
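If your IT support likes to script the first step, the log review above can be as simple as tallying AI-related traffic out of an export you already have. This is a minimal sketch only: the domain watchlist and the `user,domain` log format are illustrative assumptions - substitute the export your firewall, DNS filter, or identity provider actually produces.

```python
from collections import Counter

# Illustrative watchlist of AI-tool domains; extend with whatever your
# practice cares about.
AI_WATCHLIST = {"chat.openai.com", "gemini.google.com", "claude.ai"}

def tally_ai_hits(log_lines):
    """Count visits per watchlisted domain from lines like 'user,domain'."""
    hits = Counter()
    for line in log_lines:
        user, domain = line.strip().split(",")
        if domain in AI_WATCHLIST:
            hits[domain] += 1
    return hits

# Assumed sample export - in practice, read this from your real log file.
sample = ["alice,chat.openai.com", "bob,example.com", "alice,chat.openai.com"]
print(tally_ai_hits(sample))  # Counter({'chat.openai.com': 2})
```

Even a rough tally like this tells you which tools to ask about first, and it pairs well with the staff question: the logs show you where to look, the conversation tells you why.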
From there, the decisions are practical. Some tools will be fine to keep, with managed logins and logging in place. Some will be fine for low-risk tasks only. Some should be replaced with a sanctioned alternative that your IT support can monitor. A few will need to be blocked entirely. The goal is not to eliminate AI from your practice - it is to make sure the tools your team uses day to day are ones you can account for.
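The keep / restrict / replace / block decision can also be written down as a simple rule, which helps keep the triage consistent across tools. The sketch below is one possible encoding under assumed inputs - the category names, data tags, and tool records are hypothetical examples, not a standard; adapt them to your own data classification.

```python
# Hypothetical tags marking data that would cause harm if disclosed.
SENSITIVE = {"client", "financial", "legal", "personal"}

def triage(tool):
    """Decide what to do with one discovered tool.

    `tool` is a dict like:
      {"name": "DocSummariser", "data_tags": {"client"},
       "managed_login": False, "sanctioned_alternative": True}
    """
    touches_sensitive = bool(tool["data_tags"] & SENSITIVE)
    if not touches_sensitive:
        # Low-risk work (e.g. marketing copy): fine to keep, with logging.
        return "keep (low-risk tasks, with logging)"
    if tool.get("managed_login"):
        # Sensitive data, but under accounts you control and can audit.
        return "keep (managed login + logging)"
    if tool.get("sanctioned_alternative"):
        # Sensitive data in an unmanaged tool, but a monitored option exists.
        return "replace with sanctioned alternative"
    return "block"

inventory = [
    {"name": "GrammarBot", "data_tags": {"marketing"},
     "managed_login": False, "sanctioned_alternative": False},
    {"name": "DocSummariser", "data_tags": {"client"},
     "managed_login": False, "sanctioned_alternative": True},
]

for tool in inventory:
    print(tool["name"], "->", triage(tool))
# GrammarBot -> keep (low-risk tasks, with logging)
# DocSummariser -> replace with sanctioned alternative
```

Writing the rule down this way also gives you something to show staff: the decisions stop looking arbitrary, because the same test applies to every tool.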
If your practice runs on Microsoft 365, there are built-in controls that can help with exactly this kind of governance - sign-in and audit logging, data loss prevention policies, and settings that restrict which apps staff can grant access to their accounts. Managed IT support that includes Microsoft 365 oversight means someone is watching for these gaps on an ongoing basis, not just when something goes wrong.
ITstuffed works with professional services businesses across Canterbury to get visibility over how their systems are actually being used. A 15-minute IT Fit Check is a good place to start if you are not sure what your current exposure looks like.
