The public sector has failed to learn from its disastrous attempts to implement big IT projects, writes PETER DAVIES*.
Incis, ACCtion, NDIS and PRA chart the dismal history of New Zealand's public sector involvement in major IT projects.
These four projects - for the police, ACC, the National Library and the Justice Department respectively - all collapsed at an estimated cost to the taxpayer of more than $130 million. It is unlikely we will ever be told the cost of other, smaller projects which also collapsed.
At least three current projects are classified as high-risk and are being monitored to make sure that anything up to another $300 million is not flushed down the drain.
Apart from the financial cost, the state sector's inability to successfully deliver major IT projects has destroyed careers, discouraged staff, and caused a loss of confidence in public sector management.
Despite expensive inquiries, expert reports, and declarations that lessons have been learned, New Zealand public sector IT projects seem destined to stumble and fail with predictable frequency.
But what is it about New Zealand's public sector that seems to predispose it to IT project failure?
Following the spectacular collapse of the Incis project in 2000, I wanted to investigate why major public sector IT projects in New Zealand repeatedly failed.
My findings were:
* We should not be asking what went wrong but rather why things go wrong repeatedly.
* Recently introduced revised IT project guidelines are unlikely to prevent further fiascos.
It is no mystery why IT projects fail, and it should be no mystery in New Zealand, where public sector projects have failed for mostly the same reasons.
For example, underlying the failure of the National Library NDIS project was an attempt to develop unique technology, rather than buying "off-the-shelf".
After NDIS collapsed, the subsequent failures of the Justice Department's public registries (PRA) project and Incis were also partly blamed on the same eagerness to develop "bleeding edge" technology.
Evidence indicates two reasons why these mistakes are repeated, with the first being capability fragmentation in the state sector.
Capability fragmentation is probably an unintended result of the state sector reforms and restructuring undertaken since 1987.
As the state sector evolved into large numbers of decentralised entities, successive governments demanded lower costs from it.
Under sustained pressure, departments abandoned less quantifiable measures of performance and narrowed their focus to outputs and total spending, ignoring behaviours for which no output class existed, such as co-operation and learning between departments.
Under these conditions, it became harder to maintain appropriate levels of capability.
After all, how could any chief executive waste resources helping another department successfully deliver an IT project? There was no output classification for that.
The second reason is a lack of understanding about how far accountabilities should reach.
Revised IT guidelines focus exclusively on a system of bottom-up accountability, with ultimate responsibility for IT projects resting with the chief executive, who is responsible to the minister for that project.
That accountability structure has been in place since state sector reformation began, yet despite the appalling track record no chief executive has been fired as a direct result of a collapsed project.
Ultimately, the ramifications of expensive IT failures encourage politicians and chief executives to play down or ignore them in order to protect reputations, jobs and electoral seats, which could explain the total failure to enforce accountability so far.
The Incis report is a prime example. It is, so far, the only central Government effort specifically to inquire into the failure of a major IT project, and it failed to apportion specific blame.
Current accountability frameworks for IT projects fail to accept that some accountability must lie with the minister, as owner of the department. In more than one case of failure, the department was subject to unexpected reviews after an IT project began. The reviews frequently resulted in restructuring, and the attendant disruption was partly to blame for the collapse of the project.
As the owners, ministers should have been aware of the likely effect of a review on strategic IT projects under way. Yet in many cases the respective ministers not only failed to protect their own departments from this outside interference, but plainly did not understand the ultimate objective of the IT project anyway.
Under the revised monitoring arrangements, the effect of this type of interference is ignored: accountability is treated as something owed to the Cabinet, rather than as a process the Cabinet should also apply to itself.
By limiting accountability to departments, the current monitoring regime seems primarily designed to give the responsible ministers an opportunity to deny both accountability and responsibility.
What can be done to improve the performance of the sector's IT projects? Within the engineering profession, lessons about the relationship between failure and success are well known. Civil engineering projects have high success rates because they draw on thousands of years of experience, including failures.
The New Zealand public sector also needs to move beyond the highly politicised nature of its failures and learn from them.
The present accountability structure takes no account of what is called "cross-cutting", which is any co-operative work between Government departments.
The present lack of incentive for the public sector to practise cross-cutting means the lessons from failed projects are not available to the whole sector.
In contrast, Sweden, Canada and Britain have made co-ordination of cross-departmental relationships the focus of considerable effort, devoting substantial resources to promoting it, and in one case, making it law.
If future performance is to improve there must be a mechanism to take those lessons and make sure everyone can learn from them.
Central agencies, such as the Treasury, should be promoting a "centre of excellence" in project management, encompassing IT projects.
Such a centre would be the resource for gathering, analysing and disseminating project-related advice and training across the state sector. Managing business change involving IT is an intricate process, entailing significant risk because of uncertainty.
Obtaining advice from other Government departments, with relevant experience, mitigates those risks.
Promoting cross-cutting strategies so departments consult on areas such as developing or adapting IT infrastructure will significantly improve the public sector's ability to deliver those projects.
Whereas other countries are moving towards regimes that encourage co-operation and sharing, New Zealand's revised model of IT monitoring in its public sector still relies on bottom-up accountability frameworks - that are rarely enforced - and micro-managed process monitoring.
There is no reason to believe it will be any more successful in the future than it has been in the past.
* Since this report was completed the Government has reflected the criticisms I have made in its "review of the centre" report, and wishes to encourage greater co-operation between departments. Its long-term plans to recombine the disaggregated public sector will start to correct some of the problems I have identified, but still fail to adequately address accountability issues.
* Failing To Learn From Failure - IT Failure and the New Zealand Public Service, by Peter Davies, was completed as part of an MBA programme at Massey University Graduate School of Business in July last year.
Dialogue: Sting of bleeding edge technology