Deepfakes Are Being Used to Scam Businesses. Here Is What to Watch For
Your practice manager gets an audio message that sounds exactly like you, asking someone to transfer funds urgently. Or a staff member receives a video clip of a supplier's director saying something that changes a negotiation. Neither is real. Both could cause serious damage before anyone realises what happened.
Deepfakes - synthetic audio, video, or written content generated by AI to impersonate a real person - are no longer a Hollywood problem. They are showing up in business email scams, fake supplier communications, and targeted attacks on professional services businesses. The technology has become cheap and accessible enough that criminals are using it routinely.
The most common type is face-swapping video, where someone's face is placed onto another person's body. These are increasingly convincing, but they often have tells: lighting that doesn't quite match, skin tone inconsistencies, or hair that moves oddly. Audio deepfakes are arguably more dangerous for business because they're harder to scrutinise. A cloned voice can sound slightly flat or robotic compared to genuine recordings, with unusual pauses or emphasis. If an audio message asks you or a staff member to do something urgent and financial, that combination alone should trigger a verification call to the person directly - not a reply to the message.
Text-based deepfakes are worth understanding too. AI can now generate emails or messages that convincingly mimic how a specific person writes - their vocabulary, tone, and phrasing. If something reads like your accountant or your lawyer but the request feels out of character, slow down. Check factual claims against other sources. Be especially cautious when the message is designed to create urgency or anxiety, as that pressure is often deliberate. This kind of social engineering is covered in detail in our post on seven ways hackers get into business accounts.
The practical response isn't complicated. Build a habit in your team of verifying unusual requests through a second channel - call the person directly if an email or audio message asks for something sensitive. Treat unsolicited video or audio clips with the same scepticism you'd apply to an unexpected attachment. And report anything that looks like a targeted scam to CERT NZ, who track these threats nationally.
Device security matters here too. Clicking on a deepfake video or audio link can sometimes be the delivery mechanism for malware. If anyone in your team has clicked something suspicious, it's worth having your systems checked. Managed IT support that includes proactive monitoring means threats like this are more likely to be caught before they cause damage, rather than discovered after. Understanding the range of threats your business faces is also easier with a broader view of the types of malware catching businesses off guard right now.
Training your people to recognise these attempts is one of the most effective defences available, and it is also one of the most commonly skipped - something explored further in our piece on security awareness training most NZ businesses overlook. ITstuffed works with professional services businesses across Canterbury to keep their people and systems secure. If you want a quick read on where your current setup stands, book a 15-minute IT Fit Check.
