Researching Risky Business to Learn from Near-Misses
July 24, 2009 – People encounter risk on a daily basis – near-misses of tragedies ranging from traffic accidents to natural disasters to space missions. Yet one particular tragedy compelled associate professor of business Robin Dillon-Merrill to begin conducting risk research.
Years ago, NASA had been warned that insulating foam debris shedding from a space shuttle could damage its protective tiles. When the space shuttle Columbia disintegrated upon re-entry into the Earth’s atmosphere on Feb. 1, 2003, the foam was the culprit.
“When we lost the space shuttle Columbia for a reason that was clearly identified in the risk analysis, it showed me something,” says Dillon-Merrill, who joined the McDonough School of Business in 2001.
“There’s nothing wrong with the risk analysis tools. The issue was everything that had happened from when [the risk analysis was done] to when we lost the Columbia and the many different launches when the foam fell off and didn’t cause any problems.”
Dillon-Merrill theorizes that every time a shuttle mission ended without incident, NASA managers likely altered their feelings about the statistical probability of risk, even though the foam kept shedding and the probability of catastrophe had not changed. The Columbia Accident Investigation Report backed that idea, too, saying NASA may have grown complacent about the foam shedding.
Seeing an opportunity to learn from the tragic incident, Dillon-Merrill enlisted the expertise of Catherine Tinsley, another associate professor at the business school. Tinsley has published extensively in journals on conflict management, psychology and business.
“What motivates us is that when failures happen, you get a huge investigation of everything,” says Tinsley, who joined the business school in 1996. “But failures are really costly. Can you avoid failures and find early warning signals by paying more attention to near-misses?”
With a $250,000 grant from NASA, Dillon-Merrill and Tinsley set out to explore why people don’t learn from near-misses. The resulting study, “How Near-Misses Influence Decision Making Under Risk: A Missed Opportunity for Learning,” was published in the August 2008 issue of the journal Management Science.
“When you’re faced with a near-miss, a close call, you can interpret it two ways,” Tinsley says. “You could say, ‘Wow, our system was really resilient because this thing missed, and we’re OK.’ Or you could say, ‘Wow, our system is vulnerable because we almost got hit.’ ”
The professors say the full learning value of near-misses will be realized only when they are separated from successes to expose system vulnerabilities.
Dillon-Merrill and Tinsley say their research has wide applications, including explaining the current state of the economy and the behavior of financial institutions that engaged in risky investments. According to the professors, instead of managing risk and averting disaster ahead of time, everyone is looking backward to determine just what happened.
“The full learning value of near-misses will be realized only when they are separated from successes and examined to demonstrate not only system resilience, but also system vulnerability,” one of the professors’ studies concludes.
A longer version of this story appears in the spring 2009 issue of Georgetown Business magazine.