Aja Rock is a busy mother of four and certainly is not raking in bundles from a bitcoin scheme that she wants you to get into as well. Photo / Norrie Montgomery, File
EDITORIAL
It probably started as what seemed like a harmless prank. Let’s superimpose some famous person into a video doing something completely out of character.
But technology has a way of giving something silly unforeseen significance.
By now, many of us will have seen the outcome, perhaps without even realising it.
Former socialite Aja Rock is just one of an untold number of celebrities to have been targeted in “deepfake” scam videos. In her case, a doctored clip circulated on Instagram, appearing to show Rock promoting a bitcoin investment scheme.
She had not done anything of the sort. Last week, Rock told the Herald on Sunday she had warned as many people as she could that she had been hacked and hoped no one had fallen for the scam.
Actor Sir Sam Neill has similarly been targeted in a romance scam, with fake accounts using his name and image to try to extract cryptocurrency or cash from his fans in exchange for roles in Jurassic Park, romance or diamond rings.
Neill’s assistant had to point out that Neill would never contact people from an unverified account or ask to continue the conversation on another platform.
Others targeted include Newstalk ZB broadcaster Kate Hawkesby and sporting greats Richie McCaw and Sonny Bill Williams.
It’s big business, and schemers are always looking for more devious and harder-to-expose ways to access our accounts. Innocent Kiwis lost more than $35 million to online scammers in 2022 as parasitic cybercriminals devised ever more cunning ploys to prey on people’s trust and steal their money.
Deepfakes are not just being used to fleece people, however. They have also exploded into online adult entertainment. While that may sound amusing or harmless, the dark truth is that the consequences can be dreadful.
An artificial intelligence firm named Deeptrace tracked 15,000 deepfake videos online in September 2019, nearly double the figure from nine months earlier. It categorised 96 per cent as pornographic; of those, 99 per cent transposed the faces of female celebrities on to porn stars.
As new techniques allow everyday people to make their own deepfakes with a handful of photos, the concern is that fake videos will spread beyond celebrities to be used as revenge porn.
Danielle Citron, a professor of law at Boston University, told the UK Guardian: “Deepfake technology is being weaponised against women.” A Netsafe survey in 2019 found 5 per cent of New Zealand adults - or 170,000 people - had been victims of online image-based abuse, with instances reported even by people over 70 years old. Ninety-five per cent of the victims were women.
The technology has the real potential to put women, in particular, at risk of violence from jealous partners. The presence of such content places a target in the invidious position of having to prove their innocence when they have done nothing wrong.
Depending on the content, a deepfake may also infringe copyright, breach data protection law, and be defamatory if it exposes the victim to ridicule. There is also the specific criminal offence of sharing sexual and private images without consent, covered by New Zealand’s Harmful Digital Communications Act.
The spread of deepfakes is another example of technology developing faster than its safeguards. The ability to manipulate moving images to appear real is now in too many hands to regulate.
However, just as technology has created the headache, it may already have the cure.
American technology giant Intel says it has developed an artificial intelligence tool that can detect in real time whether a video has been manipulated using deepfake technology. Intel claims its product, FakeCatcher, can detect deepfakes within milliseconds, with a 96 per cent accuracy rate.
It may not be long before we all have to install “filters” on our devices to help us work out what is authentic and what has been faked. Technology has moved too fast for our own discernment to keep up.
In the meantime, be very wary of hitting that “invest now” button based on who seems to be endorsing it - or believing, let alone sharing, a celebrity sex tape.