Decision Making and Disorientation: The Kobe Bryant Accident

It’s been just over one year since a Sikorsky S-76 crashed in Calabasas, California, killing nine people, including basketball star Kobe Bryant. On February 9th, 2021, the NTSB issued a news release announcing its assessment of the probable cause of that accident. While the events of January 26th, 2020 are tragic, there are important lessons we can learn as aviators and apply to our own decision-making process. Below is an excerpt on the probable cause of this accident:

“The National Transportation Safety Board determines that the probable cause of this accident was the pilot’s decision to continue flight under visual flight rules into instrument meteorological conditions, which resulted in the pilot’s spatial disorientation and loss of control. Contributing to the accident was the pilot’s likely self-induced pressure and the pilot’s plan continuation bias, which adversely affected his decision-making…”

Aeronautical Decision Making (ADM) is an important yet often challenging component of our involvement in aviation. While some seasoned pilots may suggest that effective ADM is just common sense, we should be wary of that line of thinking. Common sense is learned through experience and is something we will continue to develop throughout our entire careers; there’s no such thing as “crossing the finish line” when it comes to refining our ADM skills. The introduction of Chapter 2, “Human Behavior,” in the FAA Risk Management Handbook states: “The human element is the most flexible, adaptable, and valuable part of the aviation system, but it is also the most vulnerable to influences that can adversely affect its performance.”

Three areas related to this accident offer valuable lessons we can focus on.

First is one of the components of the “PAVE” checklist, presented in Chapter 2 of the Pilot’s Handbook of Aeronautical Knowledge (PHAK). To refresh: PAVE = Pilot, Aircraft, EnVironment, and External pressures. The NTSB cites the pilot’s “self-induced pressure” as a contributing factor in this accident. The “E” in our checklist reminds us why we may experience external pressures and offers strategies to avoid falling victim to them. The FAA also says this: “Management of external pressure is the single most important key to risk management because it is the one risk factor category that can cause a pilot to ignore all the other risk factors. External pressures…figure into a majority of accidents.” No discussion of managing external pressures is complete without also considering certain “Hazardous Attitudes,” such as Invulnerability and Macho (though the NTSB makes no mention of those being contributing factors in this accident).

The second topic falls under spatial disorientation, covered in Chapter 17 of the PHAK, “Aeromedical Factors.” The investigation suggests that two vestibular illusions played a part in this accident: the leans and the somatogravic illusion. Both illusions originate in the inner ear, which sends signals to the brain about the perceived aircraft attitude. In VMC these signals are backed up by our view of the horizon, and because our senses (sight and feel) can cross-check one another, vestibular illusions are minimized or nonexistent. In IMC, however, we lose our outside visual references, which increases our susceptibility to these and other illusions. The FAA does not expect us to be doctors, but to maintain safety of flight we do need to understand the illusions we could be exposed to. What is the solution to overcome them? Trust your instruments!

Finally, we can educate ourselves on bias and how it influences our decision-making process. The FAA published an article, “Just a Bit Biased: How to See and Avoid Dangerous Assumptions,” in the July/August 2020 FAA Safety Briefing. The article discusses various types of bias; awareness of them is the first step toward avoiding them. “Some of the more common biases that affect pilots are expectation bias, confirmation bias, plan continuation error, automation bias, and automaticity.” HAA has seen students and instructors alike fall victim to bias, most commonly with hold-short instructions, runway-crossing clearances, pattern-entry instructions, and landing clearances (the option vs. landing full stop). Bias can creep up on us when we least expect it, and as the article suggests, it develops as a result of experience; following that logic, CFIs may be more susceptible to bias than students! It takes constant awareness and discipline to avoid these pitfalls.

So, what can we take away from this tragic accident to make us better aviators and professional pilots with a safety-oriented approach to our decision-making process? Whether student or CFI at a flight training school, we can remind ourselves of the valuable yet vulnerable human component in our flights. We can review PAVE before every flight to ensure we don’t easily succumb to external pressures. We can ensure we do not fall victim to hazardous attitudes. We can deepen our understanding of aeromedical factors associated with flying into IMC. We can acknowledge bias and maintain our mental discipline to avoid negative outcomes. And we can always aim for: TARGET ZERO.


AvWeb YouTube:

NTSB News Release:

NTSB Meeting Synopsis:

NTSB Accident Investigation:

Risk Management Handbook:

PHAK, Chapter 2 (Aeronautical Decision Making):

PHAK, Chapter 17 (Aeromedical Factors):

FAA Safety Briefing (Bias):