
What if? NASA

what if? is a series of blog posts that takes examples of corporate crises that could have been prevented if the organisations involved had given their employees a voice and created a psychologically safe place to work.

When I refer to behavioural science, my instinctive response is the old adage “It’s not rocket science”. And it is true: human beings and group dynamics are far more complex and fallible than rockets and aerodynamics. So it felt appropriate to make the first what if? blog post about NASA.

A Tale of Two Shuttles – Challenger and Columbia

On 28 January 1986, seven astronauts on board NASA’s Space Shuttle Challenger lost their lives when the vehicle broke up 73 seconds into the flight. The disintegration began with the failure of an O-ring seal in the right Solid Rocket Booster (SRB), which let a plume of hot gases escape. This plume, acting like a blowtorch, pierced the wall of the external tank and ignited nearly 2 million litres of fuel in a matter of seconds.

On 1 February 2003, NASA’s Space Shuttle Columbia was destroyed in a disaster that claimed the lives of all seven of its crew. Two weeks earlier, on launch day, a large chunk of foam had broken loose during ascent and struck the left wing, breaching its thermal protection system. As a result, the vehicle suffered severe overheating and disintegrated during its planned atmospheric re-entry.

The Tales of Two Engineers – Roger and Rodney

Roger Boisjoly, an engineer at Morton-Thiokol (the company that supplied the SRBs for Challenger), demonstrated that low temperatures would compromise the safety of the O-ring seals and recommended that the launch be delayed. After Morton-Thiokol officially notified NASA of this recommendation, NASA officials asked Morton-Thiokol to reconsider. After a few minutes off the phone to discuss their final (and commercial) position, Morton-Thiokol advised NASA that their data was inconclusive, and NASA decided to launch.

NASA engineer Rodney Rocha repeatedly tried to persuade NASA managers to obtain telescopic pictures of Columbia’s wing before re-entry to better understand the damage. Rocha was ignored by managers, who could not believe a piece of foam could crack open the orbiter’s wing. Rocha still regrets not “breaking the door down” to force NASA higher-ups to get those pictures and perhaps develop a last-ditch rescue plan.

Common Cultural Challenges

In the months that followed the Columbia disaster, NASA was forced to re-examine nearly every facet of its operation, from launch safety to how employees talked to one another. The Columbia Accident Investigation Board (CAIB) came to this stunning conclusion:

“We are convinced that the culture in the Space Shuttle Program was as much a cause of the accident as the foam that struck the left wing.”

The CAIB report pointed out that several factors of NASA’s culture had a direct impact on the failure of Columbia:

  • NASA’s “Can Do” attitude, inspired by past successes, discouraged individuals from speaking out and suggesting “Can’t Do.” The strong cultural bias and optimistic organizational thinking undermined effective decision-making.

  • Near-misses were viewed as successes rather than near-failures, which discouraged people from asking hard questions about tough issues and risks.

  • Focusing on operational schedule, combined with ever-decreasing resources, gradually led managers and engineers to miss signals of potential danger.

  • Efficiency drives reduced the budget for safety, sending employees conflicting messages.

  • The free exchange of information was discouraged and new information was resisted.

Three key actions were amongst the recommendations:

  1. Maintain Sense Of Vulnerability

  2. Ensure Candid Communications

  3. Learn and Advance the Culture

Now pay attention, here comes the science bit…

In their studies of Implicit Voice Theories (IVTs), James R. Detert and Amy C. Edmondson identified taken-for-granted beliefs about when and why speaking up at work is risky or inappropriate. You can see some of these IVTs at play in the self-censorship evident in the Challenger and Columbia examples.

  1. Presumed Target Identification: It’s risky to challenge existing processes because it may be seen as questioning the wisdom of the individuals who established or support them.

  2. Need Solid Data or Solutions (to Speak Up): Saying “I’m not sure” when being questioned about some aspect of a new idea you’re presenting puts you in a bad position.

  3. Don’t Bypass the Boss Upward: When you speak up to your boss in front of people who are even higher in the organisation, you make your boss look bad.

  4. Don’t Embarrass the Boss in Public: It is important to give your boss time to prepare to discuss a problem or suggestion you have prior to bringing it up in front of a group.

  5. Negative Career Consequences of Voice: Speaking up at work about possible improvements sets you up for retribution from those above you who feel threatened by your comments.

Edmondson posits five important organisational conditions, or antecedents, of psychological safety. There are some clear correlations with the CAIB findings on the culture within NASA.

  1. Team Leader Behaviour. Leaders need to be available, invite feedback and admit mistakes.

  2. Informal Group Dynamics. A sense of relatedness and an accepted distribution of power.

  3. Trust and Respect. Individuals are respected for their competence and given the benefit of the doubt.

  4. Use of Practice Fields. Creating a safe environment in which to learn and to trial or practise new techniques.

  5. Supportive Organisational Context. Allocate resources proportionately.  Share information freely.

Whatever happened to the heroes?

Rodney Rocha is asked to speak at NASA centers a few times a year to talk about lessons learned from Columbia. His message to managers: Listen up. And to workers: Speak out.

Roger Boisjoly found himself shunned by colleagues and managers, so he resigned from the company and became a speaker on workplace ethics. For his honesty and integrity, he received the Award for Scientific Freedom and Responsibility from the American Association for the Advancement of Science. Sadly, he died in 2012.

Safe Places To Work assesses the antecedents and consequences of psychological safety in the workplace. what if? this assessment had been conducted within NASA before this series of unfortunate events? If you would like to discuss this assessment for your organisation, PLEASE CONTACT US
