It was built to predict how long clients would need ACC cover, and then used to stream them and target interventions. The model drew on past claims decisions about thousands of ACC clients - without their consent.
Critics say the lack of transparency is unacceptable, particularly considering the corporation's poor track record with sensitive information.
Experts are also concerned the model could be discriminatory, for example treating similar injury claims differently depending on factors like age or ethnicity.
Overseas, such tools - which use a method called predictive analytics - have been found to contain heavy racial bias; a sentencing tool in the United States, for example, was biased against black people. The tools inherit bias from the human decisions they are trained on.
They also have problems with accuracy. A local tool built to predict children at risk of abuse, for example, was wrong 50 per cent of the time - which is why constant revision is now considered best practice.
Transparency around decision-making - allowing people to see what information was used to make decisions about them and correct it - is also part of new guidelines.
Predicting risk with computer algorithms is a method increasingly used by public agencies around the world, particularly where they deal with large amounts of data.
New Zealand's most notable case so far has been at the Ministry of Social Development (MSD), which is creating a tool to predict children at risk of abuse. The project has come under intense public scrutiny.
In an interview with the Herald this afternoon, Woodhouse said the ACC model was different to that used at MSD.
"It's a different thing. Because that is predictive analytics. That is predicting how someone at a young age may end up in the future if things didn't happen," he said.
"What ACC is doing is caring for people who have got injured. Not who might have got injured."
When the Herald said that ACC's tool was also referred to as predictive analytics in its annual reports, Woodhouse said he didn't know why that was.
Woodhouse: "The system doesn't predict, it looks backwards and says what is the history of the claimants that have had those conditions in the past."
Herald: "But that's what predictive analytics are."
Woodhouse: "I'd call that historical analytics but let's not split hairs."
He said the analytics did not determine what a claimant would get and did not inform decision-making; they simply alerted staff when a client was not recovering in the expected time.
"To suggest the database guides - It doesn't - it provides flags along the way saying if person isn't reaching these milestones maybe we need to do something different and I think that's a good thing."
He said it was possible ethnicity and gender were factors in the model's estimates, but did not think that was problematic.
Asked if he could confirm all the factors in the model, he said: "What does it matter?"
"The more the merrier. If you were older, if you were 60 and not 20 it's possible that you're going to need longer in your rehabilitation plan and that guides good decision-making by case managers, the idea that it's somehow secret or inappropriate is ludicrous."
He said he would find out from ACC whether the model had been tested for bias. ACC has yet to answer questions about reviews or the lack of transparency.
Woodhouse said the Herald would have to ask ACC to release further details of the model, although he didn't think the public were "that interested".
Earlier today, both the Green Party and the Labour Party said the model should have gone before the Privacy Commissioner.
They said the public had the right to know what the Government was doing with their information.
Both parties agreed ACC needed independent oversight. Independent advocacy group Acclaim Otago has suggested appointing a Personal Injury Commissioner to monitor the corporation and provide advocacy.