After a judge finally found that the Horizon system had bugs, errors and defects that were the most likely cause of the cash shortfalls, the Post Office has had to back down and settle large civil claims.
Some of the postmasters are now appealing against their convictions, arguing that they could not meaningfully defend themselves and were pressured into pleading guilty.
People went to prison for false accounting, some committed suicide, others were bankrupted after facing massive legal costs, and many received criminal records for fraud and theft.
All because the courts believed that computers produce accurate records and evidence, and that any errors that do occur are quickly spotted and easily remedied.
You don't need to be a coder to know that nothing could be further from the truth.
Even the best programmers get it wrong, introducing up to 30 defects per thousand lines of code (the average is closer to 120).
The result can be annoyingly small discrepancies, or more serious ones, like scientists being forced to rename human genes because Microsoft Excel insists on converting their names to dates.
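The gene-name problem is easy to reproduce. The sketch below is a hypothetical illustration of the kind of lenient month-name parsing that trips spreadsheets up, not Excel's actual logic; the `looks_like_date` helper is invented for this example.

```python
from datetime import datetime

def looks_like_date(name: str) -> bool:
    """Return True if a gene symbol parses as month-name + day number,
    the way a lenient importer might silently turn it into a date."""
    for fmt in ("%B%d", "%b%d"):  # full ("March") and abbreviated ("Mar") month names
        try:
            # strptime matches month names case-insensitively
            datetime.strptime(name, fmt)
            return True
        except ValueError:
            continue
    return False

# Real human gene symbols; MARCH1 and DEC1 were among those renamed
# (to MARCHF1 and DELEC1) precisely to dodge spreadsheet mangling.
for gene in ("MARCH1", "DEC1", "TP53"):
    print(gene, looks_like_date(gene))
# MARCH1 True / DEC1 True / TP53 False
```

Excel's importer is more aggressive still (it also accepts forms like "SEPT1" and "1-Mar"), which is why the gene naming committee eventually gave in and changed the symbols rather than fight the software.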
The long UK experience suggests that the judiciary has not expended enough thought and effort on how robust IT systems really are, and whether the evidence they produce is sufficiently reliable.
Sometimes, life or death hangs on what comes out of a black box that might or might not work as advertised.
Last week, the developers of the encrypted messaging app Signal got lucky as a Cellebrite smartphone digital forensics kit "by truly unbelievable coincidence" fell off a truck.
Cellebrite kit is used not just by law enforcement around the world, including NZ Police, but also by repressive regimes seeking to track down opponents and journalists, to imprison and, disturbingly often, to torture and kill them.
The Signal developers discovered that the Cellebrite kit contains ancient software that is full of bugs and vulnerabilities. Exploiting them takes little skill and lets an attacker run any code they want. That could mean tampering with, or deleting, the messages the authorities are after, or planting malware on police computers.
It's not just buggy software either: there can be subtle hardware bugs that make software behave one way on an older processor but another way on newer gear. Yes, this has happened; the Intel "F00F" bug is a famous historical example.
For the accused, it's very hard to contest evidence produced by computers. Doing so requires expert knowledge and computer-scientist witnesses who can review and understand large code bases to spot bugs and flawed algorithms.
Open source helps here, but discrediting computer-produced evidence remains a mission and a half: hugely expensive and beyond what most people can manage.
It doesn't mean, however, that we should give up and accept judges going "computer says no", Little Britain style, when people protest their innocence in court.
It is very hard, if not impossible, to fully test and prove the correctness and robustness of IT systems, which are becoming increasingly complex and interconnected. Things can and do go wrong in ways nobody foresaw, unfortunately.
As the case of the UK postmasters slowly grinds its way through the courts, there is growing awareness that assuming computers get it right is downright dangerous in legal cases.
The University of London's School of Advanced Study last year published a set of papers on the topic that our local legal beagles would do well to read and mull over, if they haven't already.
Considering the enormous expense any legal procedure entails, and the lasting damage done if our learned friends get it wrong in court, perhaps there needs to be a direction that judges and lawyers who aren't computer savvy should recuse themselves from IT-related cases.