In the summer of 2016, a woman in Amsterdam received a letter from the mayor’s office. It informed her that not only had one of her sons – who’d often had trouble with the police – been placed on a list of violent criminal youths, but her other son had been put on a separate database of teenagers who might become criminals in due course.
The second list had been compiled with the help of ProKid, a machine-learning system designed by academics in conjunction with the Dutch police. In a scenario reminiscent of Philip K Dick’s short story Minority Report, ProKid drew on data as wide-ranging as a young person’s address, their relationships, whether they had been arrested (not charged or convicted), whether they had been a “victim or witness of a crime”, and even whether they had been truant from school, to predict whether children aged between 12 and 18 would become involved in crime. The premise was that if the state could intervene in each child’s life now, through social services, psychologists and engagement at school, it could prevent that happening.
In reality, ProKid frequently had the opposite effect. Children couldn’t understand why they had been stigmatised, and often turned to crime precisely because they felt the authorities already saw them as criminals. The mother of the 16-year-old in Amsterdam calls it “a crazy system”, and tells journalist Madhumita Murgia: “It messed his life up.”
As Murgia shows in Code Dependent, her wide-ranging and mildly terrifying book on artificial intelligence, there’s often little explanation of how an AI system makes such decisions, or how to change its rulings. A freedom-of-information request revealed that even Amsterdam’s city authorities “had little idea why the algorithmic system had chosen specific families over others”. These were “opaque systems whose internal workings weren’t fully explainable even by their architects”.