Callaghan's evidence was overshadowed at the time by a staggering series of revelations to the commission, which finished its final hearings this week and is due to report in September.
But her comments briefly drew attention to the fact that the tragedy was not just a mining accident. As she argued at the inquiry, it was caused by repeated human error on a large scale, which has disturbing implications for all New Zealanders.
Callaghan is the director of Auckland University's human factors group, which starts from the premise that human beings - and their tendency to make mistakes - are at the centre of everything we do, especially in the workplace.
Sitting in a former ward room near her office in the old Auckland Hospital building, she explains it's the study of why apparently smart people do stupid things, often time and time again, even after they've been told not to. The answers tend to involve uncomfortable truths about how we really behave, often for hidden reasons that we may not want to admit.
The 46-year-old was sidetracked into human factors and accident investigation through a love of flying as a young doctor. Posted to Dubbo in the New South Wales outback in her first year as a would-be neurologist, she became hooked on gliding and promptly signed up for an aviation medicine career with the Royal New Zealand Air Force.
Her master's thesis examined how fear of crashing affected the decisions of air force fighter jet pilots to hit the ejector seat button. Using a simulator, she discovered that pilots generally made the right decision above the 10,000 feet ejection safety threshold but became increasingly over-cautious about abandoning their aircraft the closer they got to the ground. As a result of her work, the air force decided not to lower the threshold after all.
She later became principal medical officer of the Civil Aviation Authority, and in a joint PhD in medicine and psychology examined the unofficial reasons behind doctors' decisions about their patients.
"They're the ones we all know like 'I'm short of time', 'I'm worried that they might take me to the Health and Disability Commissioner', 'Mrs X won't get her operation unless I say she fell over'. If we don't acknowledge them and try to deal with them, how do we expect diagnostic decisions to be any better?"
The most common knee-jerk reaction to an accident, she says, is finding someone to blame. If a patient dies after a nurse accidentally gives the wrong drug, the easy answer is to blame the nurse. But she may have been distracted, tired or misread the doctor's poor handwriting, and each underlying reason could lead to a different solution.
Callaghan says this does not mean letting people off. A good company has a "just culture" which strikes a balance between encouraging workers to report safety failures without fear of reprisals and reserving the right to take disciplinary action against those who consciously disregard the rules.
Suppose we both drink and drive tonight, she says. "I could make it home scot-free, you kill somebody. And there's an element of chance to that... But the conscious disregard is your decision to drink and drive."
Apparently small problems can also have huge consequences. Everyone is familiar with getting into a different car and, in busy traffic, accidentally switching the windscreen wipers on when you meant to indicate. Callaghan says a fatal 1995 plane crash near Hamilton occurred partly because the pilots were flying an aircraft identical to the one they normally used, except for the fuel management system. "Some people died there but the underlying action is the same as you (mistakenly) flicking the windscreen wipers."
Another common knee-jerk response is improved training, which she says has become a catch-all corporate response to failure, even though most of us already know when we're doing something wrong.
An apparently crazy decision by a factory worker who removes a safety guard and loses his arm is the same kind of choice we make each time we jaywalk across a busy street.
"The choice I make is not wait at the pedestrian lights or die. It's normally something like 'I'm late for a meeting and it's my boss' - so I dodge through traffic."
Callaghan says it's also unrealistic to say that staff should speak up if everyone knows they will be punished, openly or otherwise, for doing so. This culture of saying one thing but doing another frequently leads to dangerous shortcuts.
"We call them 'routine violations', where the rule says 'x' but everyone does it another way. Everybody's aware that they're doing it another way, including supervisors, and you just get on.
"When the shit hits the fan, that's when somebody invokes the rule again and decides to take out the individual rule-breaker."
Erebus had a strong element of that, she says.
On the face of it, Captain Jim Collins went below the minimum descent altitude and crashed into the mountain. But several pilots - supported by a company brochure - said Air New Zealand routinely ignored the rule to give passengers a better view.
Even the Costa Concordia sinking off the coast of Italy in January may stem from more than just the actions of its notoriously cowardly commander, Captain Francesco Schettino.
Subsequent reports have revealed that the company had approved an even closer "sail by" in August, that many lifeboats could not be launched because the ship was tilting too sharply, and that new passengers had not been given a safety drill.
Callaghan's final tip is to avoid making new rules for the sake of it. "We often see solutions implemented before the problem's been defined and then people run around going 'But it's not working'."
Callaghan and her colleague Bridget Mintoft say that, increasingly, some businesses understand their ideas and can see benefits beyond safety. For instance, Mintoft is researching how long personal investors are willing to stick with losing stocks under stress, which many firms could find directly useful.
It's not ivory tower science, says Callaghan, passionately. It's about understanding how to get the best out of human beings "which is actually really sexy". She laughs at her own enthusiasm. "If you do it right it can have an immediate positive effect on people. Whereas exhortations to pay more attention or just do it better or let's get rid of the bad bastards ... (she lowers her voice to a stage whisper) ... it doesn't actually get you anywhere."
Quick fixes - and why they often don't work
1 Fire someone: A new person may make the same mistake
2 More training: Useless if staff were already trained but knowingly did the wrong thing
3 New rules: Unlikely to work if the old rules were ignored, often with the tacit approval of managers