Increases in fraud have been accompanied – and driven – by what a recent report calls “a quantum shift in its nature, sophistication, and complexity”. The trend is up, and I’d bet anything you like that it will continue.
Much of this has been driven by the rise of cyber fraud.
Cyber fraud can be particularly hard to investigate and respond to because many of the perpetrators are overseas. Organised fraud groups recognise that international barriers make meaningful repercussions vanishingly unlikely, so deliberately target their offending internationally. This is enabled by instant communication and online anonymity, and by virtual currencies like Bitcoin that can be sent and received online without involving any financial authorities.
Cyber-enabled fraud is already bad, but advancing technology is set to throw it into overdrive in the near future. Scammers are already starting to incorporate AI tools into their work. Many of you will have been added on Facebook by a fake profile claiming to be a friend or relative, backed by a chatbot designed to get you to hand over your credit card details. Soon, though, those profiles will be using generative AI to mimic people’s behaviour and modes of speaking near-perfectly, and will even be able to engage in convincing video conversations. The same will be true for fake phone calls, text messages and emails, which will be crafted to target victims’ unique behaviours and vulnerabilities.
People are reporting incidents of AI-enhanced fake kidnapping scams in the US, where scammers call a parent and tell them their child is being held for ransom, using a convincing AI clone of their child’s voice to add urgency and credibility. This type of scam (using an actor instead of an AI clone) has been around for years, but has been made vastly more accessible by the availability of AI voice cloning tools, which can copy a person’s voice from videos they’ve posted on social media. And with the sheer quantity of information about people’s lives that is online, it’s easier to select suitable victims and to pick a time when the person they’re claiming to have kidnapped is likely to be hard to contact. Currently, this all needs a human to identify the target and manage the phone call, but it won’t be long before AI routines are running these scams by the thousand, with no human input needed. Freed from the limitations of the human perpetrator, scams of all types will become vastly more common.
Despite its seemingly obvious significance, a report by the Independent Police Conduct Authority in November 2022 found that fraud was perceived by police “both systemically and culturally, as having low importance and little impact”. This resulted in inadequate processes for receiving and investigating fraud cases, a lack of expertise and training in the area, and a lack of focus on victims. It also identified a “tendency” to erroneously label fraud allegations as civil disputes, meaning that police could avoid having to investigate them.
Skills and knowledge in this area are currently limited. There is “very little specific fraud training” available to officers, and recruits are not taught how to investigate fraud, or how to prove that an offence was committed with intent to deceive. When specialist forensic accounting is needed, it is “often unavailable”, because those who possess it tend to work for the Financial Markets Authority or the Serious Fraud Office, whose remit does not include ordinary fraud cases.
The report even found that police often ask victims to do their own evidence-gathering and analysis in fraud cases, a practice that “does not occur with any other type of crime”.
The report identified a “vacuum in national leadership on fraud”, and noted that a truly effective response will need to be a prevention-focused whole-of-government effort that goes beyond just police.
The good news is that there is a government strategy aimed at transnational organised crime, and this is facilitating a better approach. But one key issue is expertise. Police have been well served by 1,800 additional officers, but the skills of your average frontline cop are not going to cut it against the sophistication of the fraud coming our way.
Calling all computer nerds to the thin blue line.
Dr Jarrod Gilbert is the Director of Independent Research Solutions and a sociologist at the University of Canterbury.