A small excerpt from OutThink. Enjoy!
“Our lives begin to end the day we become silent about things that matter.”
– Rev. Dr. Martin Luther King Jr.
I was sixteen years old when the Space Shuttle Challenger exploded high above NASA’s Kennedy Space Center in Florida, killing all seven astronauts aboard, including New Hampshire schoolteacher Christa McAuliffe. Sadly, because a schoolteacher was on board, thousands of children around the country were watching the event unfold on live television. In the months that followed, the renowned physicist Richard Feynman was asked to help understand what had happened in the Challenger disaster.
He not only gave famous public testimony describing the O-ring failure that led to the catastrophe, he also led a quieter inquiry, interviewing NASA engineers and leaders. He devoted the latter half of his book What Do You Care What Other People Think? to his experience working on the Rogers Commission. One of his sober conclusions was that the engineers on the front lines building the components assessed risk very differently than the leaders of the organization did.
This recognition of the risk disparity started when Louis Ullian, the range safety officer at Kennedy Space Center, began an inquiry into whether to place destruct charges on Challenger and other manned rocket flights. It was common practice at the time to place remote destruct charges on unmanned rockets in case something went wrong: it would be safer to remotely destroy an out-of-control rocket than to let it explode dangerously on the ground. Ullian found a 4% failure rate among the unmanned rocket flights he researched, and estimated that manned rocket flights, with their much higher safety standards and preparations, had about a 1% failure rate. However, when he inquired with NASA he was advised that the official probability of failure for a manned rocket flight was 1 in 100,000. He told this figure to Feynman, who replied, “That means you could fly the shuttle every day for an average of 300 years between accidents – which is obviously crazy!”
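Feynman’s retort is simple arithmetic, and it is worth seeing how little calculation his sanity check required. A minimal sketch:

```python
# Feynman's sanity check on NASA's official figure, as plain arithmetic.
# A 1-in-100,000 failure probability per flight means, on average,
# 100,000 flights between accidents. Flying once per day:
official_odds = 100_000  # flights per expected failure (NASA's official figure)
years_between_accidents = official_odds / 365
print(f"{years_between_accidents:.0f} years")  # roughly 274, which Feynman rounded to "300 years"
```

Daily flights for nearly three centuries without a single accident is, as Feynman put it, obviously crazy for a vehicle as novel as the shuttle.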
Ullian needed a figure to inform his decision of whether to place remote destruct charges on manned rockets and settled on 1 in 1,000 as a compromise. With NASA management estimating 1 in 100,000, Feynman became interested in finding out what the working engineers themselves believed the failure probability to be.
Feynman requested a meeting with a group of engineers and began asking questions about how the rockets worked, how they were assembled, and so on, in order to make a probability assessment. After a couple of hours he hit on a better idea: ask the engineers in the room what their own opinion of the risk was. He said to them, “Here’s a piece of paper each. Please write on your paper the answer to this question: what do you think is the probability that a flight would be uncompleted due to a failure in the engine?” He collected and averaged the answers in the room: 1 in 300.
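Feynman’s poll can be sketched in a few lines. The individual answers below are invented for illustration (his account gives only the average), and his account doesn’t specify whether he averaged the probabilities or the odds; this sketch averages the probabilities:

```python
# A sketch of Feynman's informal poll. These particular answers are
# hypothetical; only the method (collect anonymous estimates, average them)
# comes from the story.
answers = ["1 in 200", "1 in 300", "1 in 300", "1 in 500"]  # invented slips of paper
probs = [1 / int(a.split()[-1]) for a in answers]           # convert "1 in N" to 1/N
avg = sum(probs) / len(probs)                               # mean failure probability
print(f"about 1 in {round(1 / avg)}")                       # → about 1 in 293
```

Whatever the exact slips said, the engineers’ collective estimate landed orders of magnitude away from management’s 1 in 100,000.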
Feynman went on to observe that unlike airplanes or cars, which are built “bottom up” using integrated systems that have been tried and tested over time, the space shuttle, as a unique vehicle, was built “top down.” That is, it was conceived as a whole and assembled from individual, unique parts into a finished product. In calculating the potential failure probability, NASA management had evaluated the potential failure rate of each individual component and extrapolated a failure probability of 1 in 100,000. This makes sense when you consider that, individually, the failure probability of an engine blade, an electrical cable, or a bolt is vanishingly small.
What the engineers knew innately from their hands-on perspective was that the failure probability of the dynamically assembled whole was far higher. Yet even though that knowledge lived in the minds of the engineers, it didn’t come out until Feynman asked the question. To get closer to the truth, and to raise the aspirations of everyone in the organization, we first have to find the hidden truths. And the hidden truths are at the edges of the organization, with the people closest to the work – closest to the potential problem.
We have to ask those at the edges what they believe to be true. As leaders in the organization, we have an obligation to say out loud that we don’t know the intricacies of complex projects, and that we expect and demand that those closest to the detail publicly surface any concerns.