The Human Factors in Leadership Decision Making

Preparedness and response organizations have realized many benefits from adopting the Incident Command System (ICS) and similar formal management structures. Performance, however, depends on how the people within that system actually behave – particularly in stressful, fast-moving environments. Integrating behavioral insights into ICS training may help improve performance and outcomes.

The crisis hits – a terror attack, an earthquake, or a mine collapse. The emergency operations center (EOC) activates and team members, trained in the Incident Command System (ICS), slide into their well-rehearsed roles. The response is underway.

The use of ICS, and of its federal counterpart, the National Incident Management System (NIMS), has become the accepted standard. It provides structure amid potential chaos, clarity of roles and responsibilities, and direction for training and exercises. The widespread deployment of ICS facilitates coordination and cooperation across organizations and jurisdictions. However, ICS and most formal approaches to incident management have an “Achilles heel”: they undervalue the human factors critical to success in preparedness and response.

Conceived as a management approach, ICS is built on the assumption of rational thinking and decision making. Nobel Prize winner Daniel Kahneman, Dan Ariely, and other behavioral economists have provided ample evidence, however, that people are far less rational than many assume. Absent an understanding and application of insights from psychology and applied neuroscience, leaders are likely to fall short of the full effectiveness ICS can offer.

At the National Preparedness Leadership Initiative (NPLI), Dimension One of the meta-leadership framework and practice method is the person – understanding the self as a human and as a leader. Here are three of the most common potential decision-making traps rooted in self-understanding and how NPLI teaches leaders to avoid them.

The Amygdala Hijack

The basic operating system of the human brain is geared toward calculating risk and reward – the brain’s first job is keeping its owner alive. Whenever it senses a threat – whether being cut off in traffic or coming under actual gunfire – the amygdala ignites the freeze-flight-fight (Triple F) survival response. This instinctual mechanism exists in everyone, and numerous events in a busy EOC can trigger a hijack. The problem is that a leader cannot reason or solve complex problems while in survival mode. Leaders must reset their brains much as they would reboot a computer. Taking three deep breaths is a simple trigger script that recalibrates thinking by demonstrating self-competence. Here, ICS is particularly valuable because engaging in its practiced protocols serves a similar function for team members.

The lesson for leaders: Recognize the hijack, and be intentional and disciplined about countering it in oneself and in others.

Cognitive Biases & Heuristics

The human brain processes mountains of data each day, most of it unconsciously. It copes with this onslaught through biases and heuristics – shortcuts that enable rapid function with accuracy that is “good enough” most of the time. Consider how little active thinking a person dedicates to driving to work. Unless something unusual emerges to attract attention, much of the activity is on auto-pilot. The person gets to work safely and on time while the brain conserves energy for more difficult tasks. Dozens of these shortcuts guide leaders’ thinking and decision making every day.

These biases and heuristics can also lead people astray. Confirmation bias, for example, leads people to overweight evidence that supports an existing worldview. In the aftermath of the Boston Marathon bombings on 15 April 2013, leaders in Boston, Massachusetts, and Washington, D.C., looked at the same data and came to opposite conclusions. In Washington, leaders looked at the date – Tax Day and Patriots’ Day, close to the anniversaries of the Oklahoma City bombing and the Waco, Texas, confrontation – and initially thought that the perpetrators were home-grown terrorists. In Boston, feeling that they had good intelligence on potential domestic threats, officials initially thought the bombings were the act of an international group.

The lesson for leaders: Learn the most common biases and heuristics – awareness can help mitigate their effects in the intense back-and-forth of a response. Draw upon the different perspectives of peers and team members to counteract individual bias and improve decision quality.

Tunnel Thinking

Leaders almost automatically narrow their focus when confronted with a disaster or crisis response. They look for what they can “fix.” Often, they retreat to their operational comfort zones. It is essential, however, for a leader to see the bigger picture, discern the potentially divergent perspectives of multiple stakeholders, and anticipate the many potential secondary events that will unfold during the response: political, media, reputational, regulatory, and more. Each of these has distinct dynamics and may have interdependencies. An overlooked aspect can flare up unexpectedly – triggering an amygdala hijack – and distract or derail the leader and team members.

The tool NPLI provides to help leaders improve their decision making is the cone-in-the-cube. Imagine an opaque cube in which sits a cone. If a peephole is drilled in the top, the viewer sees a circle. If a peephole is drilled in one side, the viewer sees a triangle. Each viewer can argue vociferously that his or her observation is the correct one, based on a narrow slice of evidence. This is particularly true when someone has spent a career peering through a certain peephole or holds an advanced degree in that peephole. Viewers become invested in their perceptions, yet neither sees the full dimensions of the cone in the cube. This simple metaphorical tool helps the leader gain psychological distance from the situation, making it possible to achieve more nuanced and complete situational awareness.

The lesson for leaders: No one has the complete answer, yet everyone may have part of it. Always ask, “What am I missing?” before making a decision.

Figure: The cone in the cube. ©2016, The Presidents and Fellows of Harvard College. All rights reserved.

Successful Application of Lessons

NPLI’s curriculum is neither perfect nor comprehensive. This brief overview of some of its components points to the relevance of brain function and behavioral tendencies to success in an ICS environment. The simple proposition is that these elements should be integrated into more ICS training. No matter how robust the management structure, it will still be populated by people. The better leaders understand themselves and others, the more effective their leadership.

Eric J. McNulty

Eric J. McNulty is associate director of the National Preparedness Leadership Initiative (NPLI). Leonard J. Marcus is the NPLI’s founding co-director. They are two of the co-authors of a new book on leadership: You’re It: Crisis, Change, and How to Lead When it Matters Most (PublicAffairs, June 2019). The NPLI is a joint program of the Harvard T.H. Chan School of Public Health and the Center for Public Leadership at the Harvard John F. Kennedy School of Government.
