In the US, a mother answered the phone to hear the voice of her daughter sobbing and asking for money, in what’s called a ‘virtual kidnapping’. The scammers had found a recording of the girl’s voice and cloned it with AI software. They also spoof phone numbers so that calls appear to come from the real person or their bank. In fact, the daughter was safely home and asleep at the time.
With AI, romance scammers will be able to speak in any accent they choose, which will fool more victims. Kiwi victims may be less suspicious of an accent from an English-speaking or other Western country than of an unfamiliar one. Scammers can also make themselves look like anyone they want.
Deep fakes can also be used in business email redirection scams, where a scammer inserts themselves between, say, a tradesperson or lawyer and their client and asks for the payment account to be changed to their own. With deep fakes, a customer or client could be fooled by a video or voice call from a scammer who looks and/or sounds like the real tradesperson, lawyer or other genuine payee.
In the case of scams, the Crimes Act, Fair Trading Act and Consumer Guarantees Act provide some protection, but they weren’t written with deep fakes, or even overseas-based scammers, in mind.
Enforcing that legislation against people who are often anonymous, or who hide behind a money mule, is effectively impossible, says lawyer Arran Hunt, a partner at McVeagh Fleming. “There is also the Banking Ombudsman Scheme, but some have questioned if this is still fit for purpose, as it wasn’t really created with online scams in mind,” Hunt says.
Deep fakes are not illegal under the Harmful Digital Communications Act 2015 [HDC], says Hunt. The Act’s definition of intimate visual material requires that it depict the victim’s actual body, which deep fakes don’t.
He says the government missed an opportunity with an amendment act to the HDC in 2022 to cover deep fakes and he believes the law is in urgent need of updating. “The more the public is aware of the issue, the more likely I’ll be able to convince the government to make changes, even if I have to draft the bill.”
Auckland University professor Alex Sims is in favour of updating laws to encompass deep fakes, but says that all the legislation in the world won’t save us. We need to protect ourselves. That means don’t assume what you hear and see is true, even if it comes from an apparently trustworthy source.
There are some developments happening in the area of scam protection. Thanks to public pressure, banks are finding it harder to get away with blaming victims, and are increasingly on the hook to refund some customers who have been scammed.
New Zealand banks have been told by Commerce and Consumer Affairs Minister Andrew Bayly to get confirmation of payee [matching] systems working fast. Matching checks that the name on the account someone is paying really belongs to the person they think they’re paying, which is standard practice in the United Kingdom. In 2018 PaymentsNZ, on behalf of New Zealand banks, dumped a short-lived trial that could have saved many victims over the intervening years.
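To illustrate the idea, here is a minimal sketch of a confirmation-of-payee check. This is purely illustrative: real bank systems use far more sophisticated matching, and the function name, threshold and example payee names below are assumptions, not anything PaymentsNZ or UK banks actually use.

```python
# Hypothetical confirmation-of-payee check: compare the name the payer typed
# against the name registered on the destination account, and return a verdict
# the bank could show before the payment goes through.
from difflib import SequenceMatcher

def payee_match(entered_name: str, account_name: str, threshold: float = 0.8) -> str:
    """Return 'match', 'close match' or 'no match' for a payment attempt."""
    def norm(s: str) -> str:
        # Normalise case and whitespace before comparing.
        return " ".join(s.lower().split())

    score = SequenceMatcher(None, norm(entered_name), norm(account_name)).ratio()
    if score == 1.0:
        return "match"
    if score >= threshold:
        return "close match"  # bank might warn: "Did you mean ...?"
    return "no match"         # bank would block or strongly warn the payer

# A minor spelling variation still passes as a close match,
# but a scammer's mule account under a different name does not.
print(payee_match("Smith Plumbing Ltd", "Smith Plumbing Limited"))  # close match
print(payee_match("Smith Plumbing Ltd", "J Doe"))                   # no match
```

The point of the redirection scams described above is that the money lands in an account that does not belong to the genuine payee; even a crude name check like this would flag that mismatch before the payment is made.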
AI has genuine uses in personal finance. It can be used in budgeting and other financial apps to automatically track expenses, provide personalised budgeting, make smart savings recommendations and predict future expenses. It also enables dynamic pricing, with people offered different prices according to what an AI algorithm determines each individual is prepared to pay. Ouch.