Disaster Series: The Failure of Lessons Learned (LL) and the High-Reliability Organization (HRO): Experience-Conceptualization-Contextualization 

Daved van Stralen, MD, FAAP, Sean D. McKay, Thomas A. Mercer, RAdm, USN (Retired) 

Abstract 

Lessons Learned programs are continually at risk for “conceptual arrest” – the Lesson Learned that remains a concept, an abstraction, something that has not been, and cannot be, contextualized. A disaster creates abrupt gaps between what we thought we could do, what we must do, and the urgent necessity to engage the situation. Lessons Learned convert these experiences into more effective organizational performance and improved capabilities of personnel. In these volatile and uncertain environments, failure is an option. The belief that a dangerous context can be engaged with the same organizational and cognitive approaches as a stable environment leads to ineffective, even dangerous, Lessons Learned. Effective Lessons Learned come from an integrated program of full-spectrum analysis, appreciation of the crisis environment, and knowledge of stress and fear modulation. This is not a mixing of academic concepts with field experience. It is an integration of how concepts are used in crises – we have experience-conceptualization-contextualization. 

Introduction 

A disaster reveals gaps in the organization’s knowledge and experience. These are the gaps between the new disruptive environment and accepted theory or strongly held belief. The HRO brings order to the disruption, but this order comes more from a pragmatic rather than a normative stance. The nature of that order is rendered less accurately by a spectator’s concepts than by the insider’s detailed acquaintance (1). However, insiders have difficulty translating their experience in a manner that spectators understand. 

The HRO exploits newly gained experience to strengthen the organization or to engage in novel situations. Organizations and individuals that aspire toward high reliability seek the capability to extend their organization or themselves into new environments (2). For this to occur, they are well-served to incorporate novel and unexpected experiences into their knowledge. In this way, they can extend their capabilities and better understand their environment. The US military uses the Lessons Learned process to take deliberate corrective actions from the Lesson Learned to enhance performance (3). 

To do this, the HRO must cross the gaps between theory and practice, strongly held beliefs, and the environment. It does so through engagement. During the review, the organization conceptualizes that experience for effective contextual application. The danger is “conceptual arrest,” in which the organization accepts the produced concepts independent of the ability to use those concepts as contextual actions. 

Concepts can be “counterfeit abstractions” that imitate some, but not all, of the differentiated flux produced by attentive HRO practitioners (Karl Weick, personal communication). Alfred North Whitehead (4) described “the accidental error of mistaking the abstract for the concrete. It is an example of what might be called the ‘Fallacy of Misplaced Concreteness.’” Nevertheless, concepts can be intentionally mistaken for the concrete. William James (5) described how abstraction “becomes a means of arrest far more than a means of advance in thought.” It becomes too easy to substitute a virtual world created from concepts for the actual world by the fallacy of abstraction. 

Through the Lessons Learned process, abstractions and concepts function to enhance performance. As “misplaced concreteness,” on the other hand, they create certitude and a misplaced sense of mastery, safety, and security. 

“The fox knows many things, the hedgehog one great thing,” Archilochus, Greek Poet. Experts who are confident yet poor predictors are the hedgehogs. Their certitude, accepted as mastery, gives their followers a sense of safety and security during the flux of uncertainty brought by a disaster or other crisis. The hedgehog extends one theory to many domains, defending this approach by following Occam’s razor – the parsimony of having one theory overrides the numerous theories other people use. Hedgehogs do not entertain the idea that other views may be correct (6). 

Experts with exceptional results at forecasting (“superforecasters”), the foxes in Archilochus’ fable and Philip Tetlock’s model, know many things but each to a far lesser degree than the hedgehog. Superforecasters forecast cautiously and adjust predictions as they receive new or updated information. They readily admit to being wrong. 

Foxes are self-critical and use a point-counterpoint style of thinking that sustains doubt while reducing excessive enthusiasm. Foxes understand that opposing and contradictory forces yield stability, a feature that also confounds prediction. Superforecasters are diligent in pursuing information, updating their information, and revising their conclusions as more information becomes available. They have a greater tolerance for operating under uncertainty, easily learn from their mistakes, and improve their ability to forecast over time (6). 

Though it would seem reasonable to defer to the superforecaster, more commonly, people defer to the certitude of the hedgehog. 

Explanations we can readily understand help us cope during difficult situations with great uncertainty and contradictory information. Well-understood concepts give simple answers and support a sense of certitude, particularly when they come from an accepted expert. Unsatisfied or frustrated psychological needs can drive people toward straightforward concepts they can more easily understand. 

In ecology, “ecology of fear” describes interactions of predator-prey that occur in the absence of the predator (7). The ecology of fear develops when the absence of the threat paradoxically causes more severe and wide-ranging problems than the direct presence of the threat (8). 

Respect for certitude and the creation of local or central “hedgehog” experts becomes inevitable for the organization whose leaders unintentionally propagate the ecology of fear. Concepts give a feeling of safety and security, attracting some people to mistake abstract concepts for the concrete. Those who need to feel knowledgeable will use the ‘Fallacy of Misplaced Concreteness’ to become accepted as experts. Consequently, when the ecology of fear creates an environment that supports certitude rather than increased capabilities, individuals professing certitude undermine the Lessons Learned process. 

Identifying well-structured, tightly coupled concepts supports a sense of mastery over a conceptual order. The resulting tractability of thought and sense of security from predictability support using concepts when preparing for or working in a disaster. Accepting conceptual arrest thus becomes seductive. However, we then sacrifice reliability for the tractability of thought and the security of predictability. 

Forcing functions, described by the “color of noise” (9, 10), confound the predictability necessary for effective plans and planning. Commonly referred to as the “fog of war,” forcing functions and abrupt catastrophes result from autocorrelations that develop from internal feedback within a system (11). 

Not by concepts but by engagement do we increase our understanding of the environment. High reliability is seldom a heavy-handed application of a conceptual order. Instead, it is worked out during activity, through small actions with more significant consequences. A practical domain of engagement recognizes the overlapping and loose coupling of concepts necessary to complete a task and illuminates the problems of transferring academic work to organizational practice (1). 

The Lesson Learned process can enhance performance following a disaster through gap analysis. However, conceptual arrest, the sense of certitude, and the drive to master concepts can interfere with or impair the use of Lessons Learned to support adaptability, reliability, and engagement. 

In a disruptive, confusing, and volatile situation, analysis of the situation, the search for patterns, and attempts to create structure teach people how to engage and how to create stronger designs that effectively prevent system failure. Traveling backward in time to attribute specific reasons for an incident develops sturdier structures for our future, but with a hidden bias directing individuals to pursue pathways that make sense rather than reflect authentic causality. The Lessons Learned process frames the incident with the pragmatic stance: introspection, examination of capabilities, identification of early heralds of failure, possible actions, possible responses, and who can help. 

Captain Chesley “Sully” Sullenberger’s river landing of a passenger jet revealed the strength of HRO operations when he solved a problem that he did not know he had and the difficulty of identifying what worked and what did not. If framed as a normative incident, the water landing applies to engineers, pilots, flight crews, and passengers; if framed as a pragmatic incident, the water landing applies to all of us because unexpected incidents are a part of living (1). 

Capt. Sullenberger was trying to increase the angle of attack as much as possible just prior to touchdown before the aircraft stalled in order to maximize the flare and thus minimize the airplane’s downward velocity when it impacted the water. His effort was frustrated because the phugoid damper prevented him from getting the last 3 1/2 degrees of nose-up pitch that would otherwise have been available before stall. Consequently, the sink rate was higher than it otherwise would have been, and the rear fuselage structure was breached to the extent that a flight attendant seated in the rear was injured, and water entered the airplane. Automation that was intended to improve safety and comfort actually hindered the most adaptable part of the system, the human pilot. Sully was not aware of this until we discovered it in our investigation. [Emphasis added by the authors] 

Christopher A. Hart, former Chairman NTSB (personal communication)

A convergent, deductive, analytic approach drives the search for facts and information, which will then guarantee our conclusions – the nature of deductive reasoning is that the data guarantee the conclusion. The security offered by the structures we create and our actions reinforces the normative frame. The results are narrowing and increasingly confining, destined to cascade into destructive failure when the environment intrudes into the problem. As in Sullenberger’s water landing, a pragmatic frame enhances our capability to solve problems linked to deeper, unidentifiable structures. 

The Disaster Environment 

Spectators far from events are most likely to direct the Lessons Learned process, developing then implementing the Lessons Learned. However, the information necessary for effective Lessons Learned comes from participants operating within the event. Participants with limited experience in dangerous contexts will search for the familiar and seek the security of homeostasis. They are less likely to search the discordant event to identify the threat, thereby supporting safety. 

These insiders have difficulty translating their experience in a manner that spectators understand. These disruptions create a novel VUCA-2T environment (Table 1) and produce an unfamiliar liminal experience (Table 2).

Table 1. VUCA-2T (12) 

Volatility – A rapid, abrupt change in events 
Uncertainty – Lack of precise knowledge; need for more information; unavailability of the necessary information 
Complexity – A large number of interconnected, changing parts 
Ambiguity – Multiple interpretations, causes, or outcomes 
Threat – Impaired cognition and decision-making 
Time Compression – Limitations acquiring information, deciding, or acting before consequential changes 

Table 2. Liminality (12) 

Conventional Operations | Liminal Operations 
White noise; closed system | Red and pink noise; open system; “cosmology episode” (13) 
Familiar; structured | Threshold of transition; passage for travel, but not traveling 
Knowledge by description | Gaps in knowledge (14) 
Hierarchical support | Alone 
Standards; known rules; familiar relations | Learn by doing; old rules do not apply; new rules unknown 
Prevent failure | Consequence driven 
Euclidean space; Newtonian physics | Topological space, but learning relations; collapse of sensemaking and leadership (13) 

VUCA-2T 

The military concept of “VUCA” to describe threats to national security (15, 16) and the anthropological experience of liminality as a transition (17) have entered the lexicon of business, public safety, and healthcare. The incomplete translation of VUCA, liminality, and HRO theory into the practice of reliability and safety comes with a loss of nuance and with missed, subtle cues within the environment (1). The loss of neuromodulation as a skill, and of methods for its acquisition, goes unrecognized. 

A threat more often takes place out of sight of the public. People then reject or misunderstand how threat shapes the HRO and how it makes a program stronger. Threat acts as a motivating condition to generate a set of beliefs and behaviors from within the ranks of workers, relatively independent of the organization or regulatory agencies (18). 

In 1995, US Army researchers working in the Carlisle (Pennsylvania) Barracks, US Army War College, described the global environment that had developed at the end of the Cold War as VUCA: volatile, uncertain, complex, and ambiguous (15, 16). The concept of threat is not included because, for the military, the threat is expected. Therefore, the threat was not translated into civilian applications (19). On the other hand, one of the authors (DvS) included time compression into the element of volatility as a quality of instability (19, 20). A particular group in SOCOM (Special Operations Command) used “VUCAT” but found the element of time compression to have such importance that it should stand alone (SDM). We now use VUCA-2T to describe the physical environment of a disaster (21). 

Liminality 

The environment is the enemy in a liminal space. When structure and activity become random, we lose context. Unfamiliarity and loss of context become disorienting or overwhelming. In the disorientation of the liminal space, knowledge and facts suffer (22). Organizations operating in the liminal space that rely on strong leaders of whatever style or philosophy risk failure from abrupt changes (23-25). The liminal space is not an environment where monitoring, sensemaking, or attention can help us. 

The liminal zone described in anthropology is that space between a world we know and a world we do not, where our old rules do not apply and we have not learned the new rules (17). In this area of experience, we must engage the situation to leave, yet we do not know what works (22). The common themes across work domains include suppression of fear, trust, helping the novice, protecting your partner, recognizing fear in fellow workers, and local leadership. The organization’s response to liminality differentiates management by high-reliability organizing from conventional organization management (12). 

Reasoning 

In the liminal space, we do not have certainty, particularly for antecedent events, which impairs our ability to act on what happened earlier. This also interferes with scientific logic and the Cartesian approach to reaching truth (26) or bringing about a resolution. Instead, we use the possible consequences of our actions to guide inquiry, likely the most challenging tenet of pragmatism to appreciate (27). 

In the liminal space, constant observation and reciprocal feedback, generating and testing information, all rely on inductive processes. Leonhard Euler (28) describes the problem this creates: 

We can place our highest hopes in observations; they will lead us continually to new properties which we shall endeavor to prove afterwards. The kind of knowledge which is supported only by observations and is not yet proved must be carefully distinguished from the truth; it is gained by induction as we usually say…Indeed, we should use such a discovery as an opportunity to investigate more than exactly the properties discovered and to prove or disprove them; in both cases we may learn something useful. 

We have only observation to adapt our knowledge to emerging facts. This drives inquiry, described by John Dewey as “always a behavioral response of a reflective organism to its environing conditions….inquiry belongs to ‘action or behavior,’ which takes place in the world, not just within the mind or within consciousness…. Inquiry…is what Dewey termed an ‘outdoor fact’” (29). 

“Operators are maintained in [complex technological] systems because they are flexible, can learn and do adapt to the peculiarities of the system, and thus they are expected to plug the holes in the designer’s imagination,” Jens Rasmussen (30). 

Information and Communication 

“During a crisis, there is no time to think about each specific bit of knowledge or experience that we depend on to make sense of imperfect information and ambiguity. But having those resources immediately accessible in our minds, we use them in a conceptual decision-making process to frame the decision. We essentially quickly come up with a paradigm of how to solve the problem. It is after the fact that we retrospectively begin to attribute specific reasons for the decisions that we made.” 

Capt. Chesley “Sully” Sullenberger (personal communication)

For thermodynamics and information, entropy is a state measure on the spectrum between certainty and uncertainty. Information Theory uses entropy to describe increases in uncertainty from random internal or external sources. For Shannon, the act of choosing between messages creates information. Certainty is having only one message possible, no choice, and predictability; therefore, certainty carries no information. This creates a counter-intuitive statement: uncertainty is information. 

When we choose from randomness, we create information (31). We have many choices in the VUCA-2T environment, meaning VUCA-2T has high information entropy. By engaging the situation, we generate information through our choices (32). In the VUCA-2T environment, information is transient and must be constantly generated. 
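Shannon’s point can be made concrete with a short numerical sketch (our illustrative addition, not from the cited sources): when only one message is possible there is no choice and the entropy is zero, while equally likely messages maximize entropy and thus the information a choice generates.

```python
import math

def shannon_entropy(probs):
    # H = sum of -p * log2(p); impossible messages (p == 0) contribute nothing.
    return sum(-p * math.log2(p) for p in probs if p > 0)

# One possible message: certainty, no choice, no information.
print(shannon_entropy([1.0]))                    # → 0.0

# Four equally likely messages: maximum uncertainty, 2 bits per choice.
print(shannon_entropy([0.25] * 4))               # → 2.0

# Skewed choices fall in between.
print(round(shannon_entropy([0.7, 0.1, 0.1, 0.1]), 3))  # → 1.357
```

The counter-intuitive statement in the text follows directly: the high-entropy (many-choice) distribution is the one whose resolution yields the most information.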

Communication is the act of resolving this uncertainty. It is the process of encoding information in the environment, transmitting that information, and having someone removed from the situation decode it. The corruption of communication can occur at any point along this sequence. Information entropy increases as information is corrupted, as measured from certainty (order) towards uncertainty (randomness). Communication corrupts information (31). We must not blame the person for communication failures (32). 

Language 

Language operates in environments where environmental “noise” corrupts communication. This corruption occurs not only within the event but during Lessons Learned analysis. A structure in the language must support honest communication and contain some form of redundancy for reliability. Message feedback for calibration reduces the effect of noise (Karl Weick, personal communication). 

In the VUCA-2T environment, language is also strained by communication: the separation and distance between individuals require one person to encode information for transmission and the receiver to then decode that information. HROs have communication methods for operations in noisy environments. 

Lexicon, as the vocabulary of a discipline, is too coarse a concept to capture the elements of a language in noisy environments. Failure to appreciate this contributes to corruption of the information necessary for effective Lessons Learned. Lexical elements are the basic elements of a language that can carry meaning. For example, actors use faces and facial expressions, musicians use notes, and academicians use concepts and theories. Conflicts develop when we talk from different lexical elements. For example, doctors focus more on diagnoses, while nurses focus more on treatments. 

In the 1970s, an ED nurse rode along with fire medics. She asked one of the authors (DvS) during his service as a fire rescue medic how the medics knew to bring certain equipment into a home. In the discussion, he realized how much communication the medics had with facial expressions. During interviews for a report on fire response to a mass shooting, all interview teams had at least one person with extensive experience operating in dangerous contexts (20). The team identified numerous accounts formed into accepted practice or commonly used phrases. This led to Lessons Learned that differed from published accounts for similar incidents (33). 

The foundations of lexical elements are “those aspects of meaning which have consequences for the syntax,” which are made operational by “those bits of perceptual and cultural knowledge that form the bulk of the lexical representation” (34). Lexical elements have form, function, and meaning, just as words do. They have the same combinatorial characteristics of words and contribute to concept creation. 

Lexical elements carry action, description, persuasion, and interpretation. The structure of lexical elements has cultural use and will incorporate whether the culture relies on the environment and context for information. For the environment, we must be descriptive (it is difficult to describe a mountain in the abstract). This exactness in lexical elements leads to the outcome-oriented pragmatic stance compared to the process-oriented normative stance. 

People can become limited by their frame of reference, lexical elements, lexicon, and concepts. These limits influence how they use their understanding to translate hazards – whether to fit the hazard into their understanding or to use their understanding to extend operations into the hazard. The danger of using clichés and metaphors is that the metaphor becomes real, and the concept is treated as reality. This is beyond experience and inside the liminal. 

The Color of Noise 

Disasters bedevil attempts at planning and prediction. The ‘color’ of environmental noise describes how power distributes across frequencies – the effect of long and short periods on the environment. Without feedback, time segments and elements are independent of each other, hence the Gaussian distribution and calculable statistics and probabilities. The presence of feedback in a system causes autocorrelation, shifting the spectral frequencies from white to reddened noise. Low-frequency events bring a more significant force into the system (Table 3). 

Stable environments measurable by the Gaussian distribution are “white noise” environments. Environments where stability is at risk from external forcing functions are “red noise” environments (10). The interaction between the orthogonal axes of abstraction and contextuality produces periods that have stability or the appearance of stability. Over time, this appearance of stability can reset the perceived baseline of what we can expect or should prepare for (35). However, these environments are punctuated by instability that seems distant until it arrives; then it appears to have been a logical consequence of events (Table 3). 

Table 3. Patterns and Characteristics of Noise (15) 
Color | Structure | Variance | Distribution 
White | No frequencies dominate; flattened spectrum | Data decreases variance | Gaussian distribution; elements fully independent; no autocorrelation 
Red | Low frequencies dominate; long-period cycles | Data increases variance | Power law distribution; elements not independent; mutual/reciprocal relations 
Pink | Midpoint of red noise; slope lies exactly midway between white noise and brown (random) noise | Data continuously increases variance (distinguishes pink noise from reddened spectra) | Power law distribution; no well-defined long-term mean; no well-defined value at a single point 
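The shift from white to reddened noise described above can be sketched numerically (our illustrative addition; the AR(1) process and its feedback coefficient of 0.9 are assumptions chosen for illustration): internal feedback autocorrelates successive values, moving spectral power toward low frequencies.

```python
import random
import statistics

random.seed(42)

def white_noise(n):
    # Independent Gaussian draws: flat spectrum, elements fully
    # independent, no autocorrelation.
    return [random.gauss(0, 1) for _ in range(n)]

def reddened_noise(n, phi=0.9):
    # AR(1) process: the feedback term phi makes each value depend on
    # the last, shifting spectral power toward low frequencies.
    x, series = 0.0, []
    for _ in range(n):
        x = phi * x + random.gauss(0, 1)
        series.append(x)
    return series

def lag1_autocorr(xs):
    # Sample autocorrelation at lag 1.
    m = statistics.mean(xs)
    num = sum((a - m) * (b - m) for a, b in zip(xs, xs[1:]))
    den = sum((a - m) ** 2 for a in xs)
    return num / den

w = white_noise(5000)
r = reddened_noise(5000)
print(f"white lag-1 autocorrelation: {lag1_autocorr(w):+.2f}")  # near 0
print(f"red   lag-1 autocorrelation: {lag1_autocorr(r):+.2f}")  # near +0.9
```

The same feedback that produces the autocorrelation is what defeats Gaussian statistics: successive observations are no longer independent, so accumulating data stops shrinking the variance.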

Reddened or pink-noise environments are information insensitive. More information (or data) makes the data messier or reveals covert, unexpected influences. With events in flux, current information quickly becomes antecedent information, entrained energy changes circumstances, and what was once relevant becomes irrelevant. We operate more in a mystery, searching for and testing clues using full-spectrum analysis (36). Our drive is to prevent possible consequences from becoming a reality. 

Red noise environmental ‘forcing functions’ drive environmental influences into the organization, destabilizing the internal environment. Problems become contextually resolved by practical, pragmatic solutions. The pink noise environment is also ecological, but the problem is embedded into the environment, making these problems contextual and pragmatic (37). Problem-solving for red and pink noise environments tends toward practical common sense, focusing on consequences and a broad knowledge base (38). 

Decision theories and problem-solving developed in white noise environments tend to be information sensitive, linear, and deterministic. Within the forcing function of red noise or the abrupt catastrophe of pink noise, white noise methods become the problem. Using white noise predictions of what would happen during forcing functions or catastrophes could literally become a matter of life or death – inaccurate models could kill (39). 

A system can become trapped during the white noise interludes between events. Established veterans may misattribute the absence of forcing functions to effective structures and operations. New arrivals set their standards for operations at the level when they entered the field. The baseline then shifts (35) toward considering a white noise environment normal. Disinterest in past capabilities leads to the loss of history and memory as the organization enters a period of shifting baselines. 

Certitude is no longer an early herald of failure and becomes respected. The doubt of the veteran is a sign of constrained competence. Lessons are not learned. 

Forcing Functions 

The organization and local groups maintain continuity against these forcing functions through malleability; in topology, this characteristic is called deformability – relations can be deformed but can never be destroyed. The HRO uses LLs to identify and strengthen the relations that contribute to the organization’s and individuals’ capabilities. This process can be considered a variant of resilience, another characteristic of HROs (40). 

The disaster environment is information insensitive. Data accumulation and aggregation will increase variance rather than decrease it. At some point, increasing data increases confusion. Action during the event generates information as it slows the activity rate and creates structure toward a whiter noise environment. 

A less well-recognized effect is how new properties emerge from self-organizing local elements. 

Every moment is a new moment of evolving vulnerability. It is this activity at the local level that hinders accurate descriptions of events, particularly as cause-and-effect. Sullenberger described this in the quote above, “It is after the fact that we retrospectively begin to attribute specific reasons for the decisions that we made.” 

Engagement 

Engagement is the act of approaching and entering liminal spaces. In these situations, sometimes all we have are observation and action (41). Engagement describes actions taken without certainty that they will succeed (22). Engagement describes the approach and experience when the operator does not know what will work. “I don’t know what is happening, but I know what to do,” said a Los Angeles Fire Department firefighter. “HRO uniquely shapes the engagement that moves through and out of a liminal period,” Karl Weick (personal communication). 

We question if plans and planning, the most commonly accepted methods, are the most effective approach to entering or exiting a liminal space. The hallmark of liminal space is the uncertainty of what actions will be successful. This is what makes planning difficult (22) and what gives value in augmenting capabilities and developing reasoning for these environments. 

We have described how engagement bridges the gap between theory and practice (42) and between discrete concepts and continuous perceptions (43). Engagement bridges the gap between abstractions and details (Karl Weick, personal communication). Engagement makes use of the nuances and subtle differences in details. Details can herald an early response to therapy or be an early herald of failure. Yet, focusing on details without context is the definition of micromanagement (Karl Weick, personal communication). 

Engagement is the act of learning by doing in context, not an outcome of rational deliberation, and cannot be objectified for theory-making (44). Engaged action comes from insight and immediate feedback, with negative feedback marking the safe boundary of performance and positive feedback generating growth. All feedback generates information. Effective responsiveness brings strength through change and allostasis. 

The act of engagement bridges conceptual gaps and extends responsiveness which, in turn, forms prevention and generates resilience. We describe four domains of engagement (Table 4), sets of properties with functions and characteristics that differ between stable environments (or environments where we can expect stability) and abrupt change or liminal situations (2). 

Table 4: Domains of Engagement (2) 

Domain | Personal | Organizational (non-HRO) 
Categorization | Personal experience is a frame of reference; translate the situation into familiar terms | Standards to reduce diversity and variability (45) 
Decision making | Reciprocal feedback loops; error as limits of knowledge, boundaries of performance | Algorithms; protocols 
Affective processes | Attitudes and values; contextual nature of the information; how information flows and is used | Focus on cognitive science; decontextualized plans; the central organization of information 
Modulated stress and fear responses | Recognize the inherent vice of stress and fear; maintain adaptive thought | Rely on hierarchy; develop structure 

Gap Analysis 

The nature of disruptive events interferes with the use of accepted methods of gap analysis for operations. The gap forms between the novel experience and one’s personal experience, knowledge, and firmly held beliefs. The liminal nature of these experiences limits how one describes events. The turbulence of the VUCA-2T environment limits the documentation of environmental influences. 

When we apply concepts to situations, we act in the top-down direction – from the abstract to the contextual. Abstractions can readily be applied to situations in various contexts with little ability to verify fidelity to the situation (46). When this approach is given priority, we risk treating the abstract concept as something concrete, Whitehead’s “Fallacy of Misplaced Concreteness” (47). 

Contextual influences tend to operate from the bottom up. Both are developed through gap analysis from conceptualization through to contextualization. Both can become arrested when identifying agreeable concepts interferes with identifying methods to contextualize the LL. 

Well-meaning professionals overlook or leave behind the practical, bottom-up nature of HRO that produces its pragmatic strength. Responsiveness to rapid, nuanced, or subtle changes in the environment occurs at the level of the individual, hence the bottom-up characteristics of HRO (48-50). 

A fundamental problem of gap analysis is whether to conform to accepted concepts and models (in effect, conceptual arrest) or to extend understanding through experience-concept-context-application. 

Organizations seeking to increase reliability and decrease error look to methods of deductive analysis, scientific logic, and critical thinking. Unintentionally, this fosters certainty, disregards ambiguity, and supports deterministic, linear problem solving – more like puzzle-solving, where knowable information fits together to produce the right answer (36, 51). A conceptual approach reduces problems to linear, puzzle-solving processes. We utilize only the concepts we can conceive. This approach carries grave consequences (36). 

The belief that knowable information enhances decision-making contributes to the collection of more information and increases confidence, but with little change in accuracy (51). Rather than treating the problem as a puzzle solved by identifiable, knowable information, we can approach it as a mystery in which generated information drives new decisions (36). 

The inclusion of uncertainty widens our operational environment, making available a fuller spectrum of analysis and the pursuit of weak but salient signals (36). To John Boyd (52), a US Air Force officer and strategist who created the OODA (Observe, Orient, Decide, and Act) Loop, analysis served to differentiate concepts, a trait particularly useful during disruptions or in complex situations, when he would combine analysis with synthesis – destructive with constructive cognitive forces – in his model Destruction and Creation. 

Boyd also considered these problems a dynamic mystery rather than a static puzzle. Adrian Wolfberg (36) describes a puzzle as having one right answer you reach by obtaining and properly placing the puzzle pieces. If necessary, we can figure out the missing pieces within the puzzle matrix. We then solve the puzzles sequentially until we resolve the problem. 

Wolfberg demonstrated that, rather than puzzle-solving, we use Boyd’s concept of mystery for “mystery-solving.” This relies on “full spectrum analysis,” many lines of simultaneous engagement as events unfold across a full spectrum of possible actions. Multiple challenges can best be solved in an integrated fashion to create synergy among disparate domains. “In full-spectrum analysis, the analyst examines not only multiple, possibly interrelated intelligence problems simultaneously, but also considers contextual and influential factors that could affect the interim analysis of information and its interpretation” (36). 

There are multiple interrelated mysteries to be solved simultaneously. The solution lies within many possible explanations or overlapping pieces of explanations. As we investigate some mysteries, data may not become available until after we decide (36), which is common in critical care and disasters. 
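
The simultaneity of full-spectrum analysis can be illustrated with a toy sketch (the scoring scheme, hypothesis names, and evidence items below are illustrative assumptions, not Wolfberg’s or the authors’ method): several interrelated “mysteries” stay open and are weighed in parallel as each piece of evidence arrives, rather than the first fitting answer being locked in as the single solution.

```python
# Toy sketch of full-spectrum analysis: all hypotheses are updated on
# every piece of evidence; no line of analysis is closed prematurely.
# Hypotheses and evidence items are hypothetical, for illustration only.

evidence_stream = ["fever", "hypotension", "rash"]

# Each hypothesis lists the evidence that would support it.
hypotheses = {
    "sepsis": {"fever", "hypotension", "rash"},
    "anaphylaxis": {"hypotension", "rash"},
    "viral exanthem": {"fever", "rash"},
}

scores = {h: 0 for h in hypotheses}
for item in evidence_stream:
    for h, supports in hypotheses.items():
        if item in supports:
            scores[h] += 1  # every open "mystery" is weighed simultaneously

# All hypotheses remain live and ranked; competing explanations overlap
# rather than snapping together like disjoint puzzle pieces.
ranked = sorted(scores, key=scores.get, reverse=True)
print(scores, ranked)
```

A sequential puzzle-solver would instead commit to the first hypothesis consistent with early evidence; the parallel ranking above keeps overlapping explanations available until later data arrive.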

Models give a sense of mastery, pieced together like a puzzle without overlap or gaps. Concepts are abstract. While experience is understood through concepts, experience also reinforces and improves concepts. 

Tightly coupled concepts support expert formation and mastery but risk creating “hedgehog” experts who know one thing well and apply it to all situations (53). The linear structure allows matching resources to disturbances and solving problems much like a puzzle – finding the pieces for their proper spaces (36). Such problems match the well-structured problem amenable to algorithmic solutions, as Herbert Simon (54) described. 

The stability of a white noise environment permits context-free concepts and problem-solving, placing greater significance on classifications (45) and abstractions (46). System change occurs over generations in an evolutionary manner rather than through context-dependent ecological processes. Leadership is less important than executive, administrative, and managerial skills (54), where ‘categorical work’ creates classifications and rules to work by (45) [see below]. 

Identifying well-structured, tightly coupled concepts supports a sense of mastery over a conceptual order. The resulting tractability of thought and sense of security from predictability support using concepts when preparing for or working in a disaster. It then becomes seductive to accept conceptual arrest. However, we then sacrifice reliability for the tractability of thought and the security of predictability. 

The HRO must cross the gaps between theory and practice and between strongly held beliefs and the environment. This is accomplished through engagement. During the review, the organization conceptualizes that experience for effective contextual application. The danger is “conceptual arrest,” in which the organization accepts the produced concepts independently of the ability to use the concepts as contextual actions. 

Evaluating Motion and Continuity 

We generate LLs from either outside or within the event. Both views have great utility for learning in the HRO. From the outside, we choose a position in space or time that gives a “whole field view” of the evolving disaster. When viewing from the inside, as a “local grouping” of people would experience a disaster, we select a starting place. From the starting place, we observe the local effects of the event on the local grouping. We can later aggregate local information to develop a larger field view. 

The whole field view observer at a fixed point detects the local group only if the group is visible as it passes by the fixed point. The local grouping observer moves with the group through the flux of events. The whole field view of the flow of events is an aggregate of local flows and velocities experienced by local groupings. Local groupings are not an immediately compelling consideration in whole field view specifications but are fundamental for local grouping specifications.

The outside view is too quickly taken as the top-down approach; the inside view, within the flow of events, is assumed to be a bottom-up view. This understanding is too simple. A top-down approach develops when concepts or abstractions from a centralized authority guide action, while the bottom-up approach develops when contextual, local actions influence the centralized authority. This HRO characteristic is distributed authority or deference to expertise (40). The two views are different levels of analysis. Arguing across levels of analysis creates false debate (55). The mistaken belief that the two views are unrelated or in conflict may lead to inaccurate models that, untested in the VUCA-2T environment, can be deadly (39). 

We can better understand these views not as directional influences but as specifications from outside or within the flux of events. The “whole field view,” from outside the flux of events, observes a specific area from a fixed position, though the “fixed” position can be moved to increase the scope of the field. Whole field observers primarily use static location and time coordinates as independent variables. 

A “local grouping” specification refers not only to the group’s positions within the flux of events but to the group itself as an independent unit, including its actions. Within the flux of events, a local group becomes deformable; therefore, the position of the grouping is more important than the size of the group. 

Boundary objects, singular objects used in both views, support effective and efficient interactions (45). For example, the rate of change of events, a forcing function, is shared by both views. The rate of change creates a trajectory that is observable in the full field view specification to which the organization must respond. It also creates local changes to which the local group must respond as a local group specification. 

Quantities for whole field view specifications are measured primarily as rates and direction of change and spatial distribution, but at an instant during the evolution of events. The relative simplicity of this formulation can result in analysis that produces groupings of data. Statistical analysis and probability calculations benefit when these groupings are independent of other data groupings. Concepts and models can then be developed and refined. 

For local grouping specifications, the group’s identity becomes an independent variable. 

  • This form emphasizes changes to the state in a frame of reference that moves with events. 
  • The primary measurement of change is the velocity of change rather than the physical direction of movement. 
  • Entropic changes within events cause actions of the local group. 
  • The velocity of events and pressure on the local group are variables within the event. 

The whole field view specification formulates movement as static coordinates that can also apply to local groupings. Local grouping specifications have coordinates that move with events. 
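
The contrast between the two specifications parallels the fixed-frame and moving-frame descriptions of a flow. A minimal sketch, assuming a made-up one-dimensional “flux of events” (the velocity function below is purely illustrative), shows how the same field yields different records depending on whether the frame of reference is stationary or moves with events:

```python
import math

# Hypothetical 1-D "flux of events": velocity of change as a function of
# position x and time t. The field itself is an assumption for illustration.
def velocity(x, t):
    return 1.0 + 0.5 * math.sin(x - 0.3 * t)

dt = 0.01
steps = 1000

# Whole field view: observe the field at a fixed position x0 over time
# (static location and time coordinates as independent variables).
x0 = 2.0
whole_field_record = [velocity(x0, n * dt) for n in range(steps)]

# Local grouping view: the frame of reference moves with events; the
# "group" is advected by the local velocity (a simple Euler step).
x = 2.0
local_record = []
for n in range(steps):
    v = velocity(x, n * dt)
    local_record.append(v)  # what the group experiences locally
    x += v * dt             # the group moves with the flux of events

# Same field, different specifications: the records generally differ.
print(whole_field_record[-1], local_record[-1])
```

The fixed-point record is what a spectator at one location would log; the advected record is what the local grouping lives through, which is why aggregating local records is needed to recover the whole field view.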

The whole field view, outside the flux of events, is more amenable to reliance on concepts and is tolerant of abstractions. The contextual nature of local groupings, from within the flux of events, is not tolerant of abstractions. Rather, abstractions can be dangerous and can kill (46). 

Lessons Learned 

We too readily consider cognitive approaches more objective than affective processes, which are derisively dismissed as “emotional.” Lost in this view is how we operate in the VUCA-2T environment – through the value of information in an unstable environment. Mastery of concepts is of less use where information is in flux, the value of information shifts, and we do not know what decision information is or will be necessary – even after the decision is made and acted on. In effect, the affective domain conforms cognitive knowledge to the changing situation, real-time adaptation to a red noise forcing function. Affective judgment becomes more critical for sensitizing one to detect subtle or nuanced threats and hazards and the salience of early heralds of failure. 

The motive for studying Lessons Learned comes from changes in the world and from added information. We evaluate and judge information for the development of Lessons Learned. Judgment does not have an external auditor; we use our judgment to judge our judgment and take offense when someone questions our objective judgment. This is not a trite observation. For decades, one of the authors (DvS) has advised students and residents on how to introduce disconfirming information to a superior – using the belief update operator from doxastic modal logic (56, 57). 

Epistemology has two aspects: the definition of knowledge and its logical inferences. The two logical inferences are epistemic logic, concerning knowledge, and doxastic logic, concerning belief. Doxastic logic provides reasons for belief rather than knowledge. The difference is that a belief is probable but not necessarily true. Doxastic operators capture belief change as “belief updates” (the world has changed) and “belief revisions” (we have added information). 

  • A belief update refers to accounting for a change in the situation and acquiring new, more reliable information; this requires us to change our inaccurate old beliefs to more accurate, new ones. 
  • Belief revision occurs when we identify the old information as being less reliable and use new, more reliable information to revise our older beliefs; we keep the new belief as close as possible to the old belief while accepting the newer, more accurate information. 
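
The two doxastic operations above can be sketched in code. This is a toy model with hypothetical function names, not a formal doxastic logic; it only illustrates that an update lets the changed world override old beliefs, while a revision accepts the more reliable information with minimal change to the old belief state:

```python
# Toy belief states as plain dicts; function names are illustrative.

def belief_update(beliefs, new_facts):
    """The world has changed: replace inaccurate old beliefs with
    new, more accurate ones wherever the new information speaks."""
    updated = dict(beliefs)
    updated.update(new_facts)  # the changed world overrides old beliefs
    return updated

def belief_revision(beliefs, new_facts):
    """The old information was less reliable: accept the newer, more
    accurate information while keeping the revised state as close as
    possible to the old beliefs (minimal change, counted here)."""
    revised = dict(beliefs)
    changed = 0
    for key, value in new_facts.items():
        if revised.get(key) != value:  # touch only what must change
            revised[key] = value
            changed += 1
    return revised, changed

old = {"airway": "patent", "perfusion": "adequate"}
# Update: the situation itself has changed.
print(belief_update(old, {"perfusion": "poor"}))
# Revision: our earlier reading was unreliable; minimal change is tracked.
print(belief_revision(old, {"perfusion": "poor"}))
```

The change counter makes the revision’s minimal-change principle explicit: only the belief contradicted by the more reliable information is touched, and the rest of the old belief state is preserved.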

The judgment people bring to Lessons Learned is not entirely objective. Faced with uncertainty and contradiction, people often prefer broad, internally consistent explanations that preserve their beliefs. Lessons Learned that initiate change may be construed as criticism of accepted operations, conflicting with the strong positive feelings people have toward their group. Some organization members seek external validation to maintain a strong positive image of themselves. They may resist Lessons Learned or adapt them to support their belief in themselves (58). 

Lessons Learned that contradict predispositions or previously held worldviews will likely be over-scrutinized. Motivated reasoning (59) describes the response when conflicting or disconfirming information challenges closely held beliefs or identities. The individual then over-scrutinizes information that conflicts with those beliefs and too readily accepts data that support the belief (57). 

Systems respond to internal feedback (autocorrelation), making red noise-forcing functions inevitable. Human behavior also responds to feedback; therefore, human behavior operates in the reddened spectra. We cannot predict how people will respond to a crisis. Individuals themselves cannot predict how they will respond. While a top-down engineering structure can produce neatly structured Lessons Learned, the red noise of human behavior prevents such orderly execution. 
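
The claim that internal feedback (autocorrelation) reddens a system’s spectrum can be demonstrated with a minimal sketch. It models red noise as a random walk, i.e., integrated white noise, which is one common way to generate reddened spectra (an assumption of this illustration, not a model of human behavior):

```python
import random

random.seed(42)  # deterministic illustration

n = 5000
# White noise: independent draws with no memory of the past.
white = [random.gauss(0.0, 1.0) for _ in range(n)]

# Red noise: each value feeds back into the next (cumulative sum),
# so the series remembers its own history.
red = []
total = 0.0
for w in white:
    total += w
    red.append(total)

def lag1_autocorr(series):
    """Lag-1 autocorrelation: how strongly each value predicts the next."""
    mean = sum(series) / len(series)
    num = sum((series[i] - mean) * (series[i + 1] - mean)
              for i in range(len(series) - 1))
    den = sum((x - mean) ** 2 for x in series)
    return num / den

print(lag1_autocorr(white))  # near 0: no feedback, no memory
print(lag1_autocorr(red))    # near 1: feedback produces strong memory
```

The contrast is the point of the paragraph above: once a system’s output feeds back into its input, successive states are no longer independent, and white-noise (Gaussian, memoryless) assumptions stop holding.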

In Neonatology and disasters, we encounter numerous critical gaps: 

  • The newly disruptive environment and accepted theory or firmly held belief; 
  • The pragmatic stance and the normative stance; 
  • Theory and practice; and 
  • Firmly held belief and the environment. 

Tetlock’s metaphorical foxes understand that opposing and contradictory forces yield stability, a feature that also confounds prediction. In minor situations, we readily negotiate these gaps. With the critically ill neonate or during a disaster, even minor levels of stress impair cognition, and we must then operate with the inherent vice of stress-induced disorders, fear circuitry disorders, and amygdala-driven behaviors (60, 61). 

Engaging these gaps relies on opposing and contradictory forces that can initiate stress-induced disorders, fear circuitry disorders, and amygdala-driven behaviors. It is these consequences that Lessons Learned addresses. 

Bringing spectators and insiders together makes the effects of different frames of reference visible: the whole field observer and insiders within local groups. This is more than a matter of where one stands. The VUCA-2T environment is not amenable to reductionist methods that create linear vectors for causation. The effects of liminality hide the effective cognitive actions taken during crises. Stress-induced disorders, fear circuitry disorders, and amygdala-driven behaviors lead to justifications and explanations that impair the identification of necessary problems and solutions. Insiders have difficulty translating their experience in a manner that spectators understand. 

While not comprehensive, this article discusses some of the impairments that have prevented the identification and implementation of effective Lessons Learned programs in healthcare (Table 5). 

Table 5: Impaired Lessons Learned 

Standard Specifications | Lessons Learned Specifications

Environment 
Stability readily regained; assumption of white noise | VUCA-2T; red noise forcing functions 
Gaussian distribution | Unpredictability 
Information from observation; data-driven | Information is generated; information insensitive 

Specifications 
Fixed-point observation, the spectator’s view; a stationary frame of reference | Within the flux of events, the insider’s view; the frame of reference moves with events 
Whole field view as the primary reference | Local grouping as the primary reference 
Top-down approach | Synthesis of top-down and bottom-up approaches 
Specifications are primarily rates and direction of change and spatial distribution | Specifications are primarily changes to the state; the velocity of change, not the direction of movement 
The analysis produces groupings of data; statistical analysis and probability calculations | The analysis is difficult due to local entropic change; local events cause actions of the local group 

Concepts 
General concepts | Detailed acquaintance 
Safety and security from mastery of concepts | Safety and security from capabilities 
Error from mistaking abstract concepts for concreteness | Error revises and corrects concepts 
Categorization guides sensemaking | Sensemaking creates new categories 

Reasoning 
Puzzle-solving; limited-spectrum analysis | Mystery-solving; full-spectrum analysis 
Certitude | Doubt 
Certitude and confidence yield poor predictions | Opposing, contradictory forces yield stability 
Deductive reasoning (facts guarantee the conclusion) | Inductive reasoning (facts are constantly re-evaluated) 
Classical logic | The logic of practice, modal logics, paraconsistent logic 
Decision-making linear and deterministic | Decision-making from reciprocal feedback 
Reductionism | Nonlinearity 
Cognitive processes | Affective processes 
The “ecology of fear” | Modulated stress and fear 

Communication, language 
Lexical elements: concepts, words | Lexical elements: body language 
Information entropy | Communication for noise 
Reset baseline | The past as Lessons Learned

During the thirty years from 1921 to 1952, mountain climbers from eleven expeditions to Everest failed to climb higher than 27,000 feet. One year after the physiologist Griffith Pugh joined the effort, Sir Edmund Hillary and Tenzing Norgay reached the summit of Mount Everest, smiled, removed their oxygen sets, and took photos (62). From Eric Shipton’s list of studies (63), we know high-altitude climbers were cognizant of the problems impeding success and were familiar with the science of human performance in low-oxygen environments. 

The expertise of Mount Everest climbers produced articulate, accurate observations of the problems they encountered and characterized the need for better science (64-67). The expertise of scientists produced a better understanding of environmental hypoxia and engineered technology for oxygen administration that could be readily adapted from military aviation (39, 68, 69). However, just as a gap exists between theory and practice (70) or discrete concepts and continuous perception (71), a discontinuity, a separation, exists between the protected, well-controlled laboratory study and the dangerous, volatile high-altitude environment (62, 72). 

Lessons Learned developed by spectators or solely by the whole field observer will not contain the necessary nuance for the VUCA-2T environment nor support the individual crossing the liminal threshold. As in the story of Mount Everest or our experience interviewing participants in shooting incidents, effective Lessons Learned develop when subject matter experts participate (3) and contributions come from individuals with academic knowledge and field experience to guide the program. 

Concepts without context cannot predict consequences. Experience without concepts cannot predict what may work. Individuals familiar with concepts tempered by experience and experience extended by concepts act as a resource. 

Conclusion 

The introduction of HRO into healthcare and the development of patient safety programs have taken place over the past two decades. From published accounts, it is hard to tell whether HRO is being incorporated into the healthcare system; what is missing are effective processes for decision-making and the lexical elements to operate in a noisy environment. 

HRO gets worked out by utilizing small activities with more significant consequences (Karl Weick, personal communication). In these small activities and mundane situations, vigilance for early heralds of failure lays the beginnings of high-reliability operations. That is where the organization engages in ‘covert compensated states’ (19). For Karl Weick, this is “what happens when the autopilot is turned off and you ‘discover’ first-hand what forces had been held in check and balanced automatically. Isn’t that what happens when habits suddenly break down? Covert compensation may be somewhat of a synonym for habit, routine.” 

Lessons Learned can apply experience gained in a crisis or catastrophe to routine operations – vigilant for the sudden breakdown of habits during red noise forcing functions. For this to occur, the Lessons Learned approach must not only move into the field but must also incorporate the lexical elements necessary to communicate in a noisy environment. 

Not by mastery of models nor by expertise in operations will an organization achieve HRO. HRO emerges from the practical application of science in a particular context. The purpose of Lessons Learned is to increase individuals’ capabilities and the organization’s performance toward engagement in situations where what works is not certain. Otherwise, there is a failure of lessons learned. 

References 

1. van Stralen D. Pragmatic High-Reliability Organization (HRO) During Pandemic COVID-19. Neonatology Today. 2020;15(4):3-9. 

2. van Stralen D, Mercer TA. High Reliability Organizing (HRO) is the Extension of Neonatology during Pandemic COVID-19. Neonatology Today. 2021;16(5):97-109. doi: 10.51362/neonatology.today/2021516597109

3. Army US. Establishing a Lessons Learned Program: Observations, Insights, and Lessons. Fort Leavenworth, KS: Center for Army Lessons Learned; 2011. 

4. Whitehead AN. Science and the modern world. New York, NY: The Macmillan Company; 1925. 

5. James W. Abstractionism and ‘Relativismus’. The Meaning of Truth. New York, NY: Longmans, Green and Co; 1911. p. 246-71. 

6. Tetlock PE. Expert Political Judgment: How Good Is It? How Can We Know? 2nd ed. Princeton, NJ: Princeton University Press; 2017. 

7. Brown JS, Laundre JW, Gurung M. The Ecology of Fear: Optimal Foraging, Game Theory, and Trophic Interactions. Journal of Mammalogy. 1999;80(2):385-99. 

8. van Stralen D, Mercer TA. Pandemic COVID-19, the High- Reliability Organization (HRO), and the Ecology of Fear. Neonatology Today. 2020;15(12):129-38. doi: 10.51362/neonatology.today/2020121512129138

9. Halley JM. Ecology, evolution and 1f-noise. Trends in ecology & evolution. 1996;11(1):33-7. 

10. van Stralen D, McKay SD, Mercer TA. Disaster Series: High Reliability Organizing (HRO) for Disasters – Disaster Ecology and the Color of Noise. Neonatology Today. 2021;16(12):96-109. doi: 10.51362/neonatology.today/2021161296108

11. Boettiger C. From noise to knowledge: how randomness generates novel phenomena and reveals information. Ecol Lett. 2018;21(8):1255-67. Epub 2018/05/24. doi: 10.1111/ele.13085. PubMed PMID: 29790295. 

12. van Stralen D, Mercer TA. High-Reliability Organizing (HRO) in the COVID-19 Liminal Zone: Characteristics of Workers and Local Leaders. Neonatology Today. 2021;16(4):90-101. doi: 10.51362/neonatology.today/2021416490101

13. Weick KE. The collapse of sensemaking in organizations: The Mann Gulch disaster. Administrative science quarterly. 1993;38(4):628-52. 

14. van Stralen D, McKay SD, Mercer TA. Identifying Gaps – Entering the Path to High Reliability Organizing (HRO). Neonatology Today. 2022;17(8):29-42. 

15. Arnold III AV. Strategic visioning: What it is and how it’s done. Carlisle Barracks, PA: United States Army War College, 1991. 

16. Magee RR. Strategic leadership primer. Carlisle Barracks, PA: United States Army War College, 1998. 

17. Szakolczai A. Liminality and experience: Structuring transitory situations and transformative events. International Political Anthropology. 2009;2(1):141-72. 

18. Fitzpatrick JS. Adapting to danger: A participant observation study of an underground mine. Sociology of Work and Occupations. 1980;7(2):131-58. 

19. van Stralen D, Byrum S, Inozu B. High Reliability for a Highly Unreliable World: Preparing for Code Blue through Daily Operations in Healthcare. North Charleston, SC: CreateSpace Publishing; 2017. 

20. van Stralen D, McKay SD, Williams GT, Mercer TA. Tactical Improvisation: After-Action/ Comprehensive Analysis of the Active Shooter Incident Response by the San Bernardino City Fire Department December 2, 2015. San Bernardino, CA: San Bernardino County Fire Protection District; 2018. 

21. McKay S, James J, Greg S, Dominick B, Bryan H, Ditzel R, et al. Refining Operational Vertical Mobility. Journal of High Threat & Austere Medicine. 2021:19. doi: 10.33553/jhtam.v3i1.33

22. van Stralen D, Mercer TA. The Nature of Neonatal Experience during Pandemic COVID-19. Neonatology Today. 2021;16(3):87-97. doi: 10.51362/neonatology.today/202131638797. 

23. Tempest S, Starkey K, Ennew C. In the Death Zone: A study of limits in the 1996 Mount Everest disaster. Human Relations. 2007;60(7):1039-64. 

24. van Stralen D, McKay SD, Mercer TA. Pragmatic Leadership Practices in Dangerous Contexts: High-Reliability Organizing (HRO) for Pandemic COVID-19. Neonatology Today. 2020;15(8):109-17. 

25. van Stralen D, McKay SD, Mercer TA. Flight Decks and Isolettes: High-Reliability Organizing (HRO) as Pragmatic Leadership Principles during Pandemic COVID-19. Neonatology Today. 2020;15(7):113-22. 

26. Legg C, Hookway C. Pragmatism. In: Zalta EN, editor. The Stanford Encyclopedia of Philosophy: Stanford University Press; 2020. 

27. Star SL. Living grounded theory: Cognitive and emotional forms of pragmatism. In: Bryant A, Charmaz K, editors. The Sage handbook of grounded theory. Los Angeles, CA: Sage Publications; 2007. p. 75-94. 

28. Pólya G. Induction and analogy in mathematics. Princeton, NJ: Princeton University Press; 1954. 

29. Hickman LA. Pragmatism as post-postmodernism: Lessons from John Dewey. New York, NY: Fordham University Press; 2007. 

30. Rasmussen J. What Can Be Learned from Human Error Reports? In: Duncan KD, Gruneberg MM, Wallis D, editors. Changes in Working Life. New York, NY: Wiley; 1980. p. 97–113. 

31. Shannon CE. A Mathematical Theory of Communication. Bell System Technical Journal. 1948;27(3):379-423. doi: 10.1002/j.1538-7305.1948.tb01338.x. 

32. van Stralen D. Ambiguity. Journal of Contingencies and Crisis Management. 2015;23(2):47-53. doi: 10.1111/1468-5973.12082. 

33. van Stralen D, McKay SD, Inaba K, Hartwig M, Mercer TA. Comment on “A Tactical Medicine After-action Report of the San Bernardino Terrorist Incident”. West J Emerg Med. 2018;19(5):825-6. Epub 2018/09/12. doi: 10.5811/westjem.2018.6.39216. PubMed PMID: 30202494; PubMed Central PMCID: PMC6123096. 

34. Lieber R. Morphology and lexical semantics. Cambridge, UK: Cambridge University Press; 2004. 

35. Pauly D. Anecdotes and the shifting baseline syndrome of fisheries. Trends in Ecology & Evolution. 1995;10(10):430. 

36. Wolfberg A. Full-spectrum analysis: A new way of thinking for a new world. Military Review. 2006;86(4):35-42. 

37. van Stralen D, Mercer TA. High-Reliability Organizing (HRO), Decision Making, the OODA Loop, and COVID-19. Neonatology Today. 2021;16(8):86-96. 

38. van Stralen D, Mercer TA. Common Sense High Reliability Organizing (HRO) in the Response to COVID-19. Neonatology Today. 2021;16(7):90-102. doi: 10.51362/neonatology.today/2021716790102. 

39. Heggie V. Experimental physiology, Everest and oxygen: from the ghastly kitchens to the gasping lung. The British Journal for the History of Science. 2013;46(1):123-47. doi: 10.1017/s0007087412000775

40. Weick KE, Sutcliffe KM. Managing the Unexpected: Assuring High Performance in an Age of Complexity. Quinn RE, editor. San Francisco, CA: Jossey-Bass; 2001. 

41. van Stralen D, Mercer T. The Art of Neonatology, the Art of High Reliability as a Response to COVID-19. Neonatology Today. 2021;16(2):74-83. doi: 10.51362/neonatology.today/202121627483. 

42. van Stralen D. Pragmatic HRO during Pandemic COVID-19. Neonatology Today. 2020;15(4):3-9. 

43. Weick KE. Organizing for transient reliability: The production of dynamic non-events. Journal of contingencies and crisis management. 2011;19(1):21-7. 

44. Zundel M, Kokkalis P. Theorizing as engaged practice. Organization Studies. 2010;31(9-10):1209-27. 

45. Star SL. This is Not a Boundary Object: Reflections on the Origin of a Concept. Science, Technology, & Human Values. 2010;35(5):601-17. doi: 10.1177/0162243910377624

46. Weick KE. Arrested Sensemaking: Typified Suppositions Sink the El Faro. in press. 2022. 

47. Whitehead AN. Science and the modern world. Cambridge, UK: Cambridge University Press; 1926. 

48. van Stralen D. High-reliability organizations: Changing the culture of care in two medical units. Design Issues. 2008;24(1):78-90. 

49. van Stralen DW, Calderon RM, Lewis JF, Roberts KH. Changing a pediatric sub-acute facility to increase safety and reliability. Patient Safety and Health Care Management. 7: Emerald Group Publishing Limited; 2008. p. 259-82. 

50. van Stralen D, Mercer TA. Ambiguity in the Operator’s Sense. Journal of Contingencies and Crisis Management. 2015;23(2):54-8. 

51. Heuer RJ, Jr. Psychology of Intelligence Analysis. Langley, VA: Center for the Study of Intelligence, CIA; 1999. 

52. Boyd J. Destruction and creation. Fort Leavenworth, KS: US Army Command and General Staff College, 1976. 

53. Tetlock PE. Expert Political Judgment. Princeton, NJ: Princeton University Press; 2009. 

54. van Stralen D, McKay SD, Mercer TA. Flight Decks and Isolettes: High-Reliability Organizing (HRO) as Pragmatic Leadership Principles during Pandemic COVID-19. Neonatology Today. 2020;15(7):113-21. doi: 10.51362/neonatology.today/20207157113121

55. MacDougall-Shackleton SA. The levels of analysis revisited. Philos Trans R Soc Lond B Biol Sci. 2011;366(1574):2076-85. Epub 2011/06/22. doi: 10.1098/rstb.2010.0363. PubMed PMID: 21690126; PubMed Central PMCID: PMCPMC3130367. 

56. Garson J. Modal Logic. In: Zalta EN, editor. The Stanford Encyclopedia of Philosophy: Metaphysics Research Lab, Stanford University; 2021. 

57. van Stralen D, McKay SD, Mercer TA. Operational Logics and Inference During [1/f or f -1] Noise Events: High-Reliability Operations (HRO). Neonatology Today. 2022;17(3):18-31. 

58. Douglas KM, Uscinski JE, Sutton RM, Cichocka A, Nefes T, Ang CS, et al. Understanding conspiracy theories. Political Psychology. 2019;40:3-35. doi: 10.1111/pops.12568

59. Kunda Z. The case for motivated reasoning. Psychological bulletin. 1990;108(3):480. 

60. van Stralen D, McKay SD, Hart CA, Mercer TA. Implementation of High-Reliability Organizing (HRO): The Inherent Vice Characteristics of Stress, Fear, and Threat. Neonatology Today. 2022;17(6):26-38. 

61. Bracha HS. Human brain evolution and the “Neuroevolutionary Time-depth Principle:” Implications for the Reclassification of fear-circuitry-related traits in DSM-V and for studying resilience to warzone-related posttraumatic stress disorder. Progress in Neuro-Psychopharmacology and Biological Psychiatry. 2006;30(5):827-53. 

62. van Stralen D, Mercer TA. High Altitude Climbing, High Reliability, COVID-19, and the Power of Observation. Neonatology Today. 2021;16(1):68-79. doi: 10.51362/neonatology.today/20211616879

63. Shipton E. The Expedition to Cho Oyu. The Geographical Journal 1953;119(2):129-37. 

64. Bruce CG. The Assault on Mount Everest, 1922. London, UK: E. Arnold & Company; 1923. 

65. Tilman HW. Everest 1938: Whether these mountains are climbed or not, smaller expeditions are a step in the right direction. Cambridge, UK: Cambridge University Press; 1948. 

66. Norton EF. The Fight for Everest 1924: Mallory, Irvine and the Quest for Everest. Sheffield, South Yorkshire: Vertebrate Publishing; 2015. 

67. Howard-Bury CK. Mount Everest: The reconnaissance, 1921. London, UK: Edward Arnold & Co.; 1922. 

68. Douglas CG, Haldane JS, Henderson Y, Schneider EC. VI. Physiological observations made on Pike’s Peak, Colorado, with special reference to adaptation to low barometric pressures. Philosophical Transactions of the Royal Society of London Series B, Containing Papers of a Biological Character. 1913;203(294-392):185-318. 

69. West JB. George I. Finch and his pioneering use of oxygen for climbing at extreme altitudes. Journal of Applied Physiology 2003;94(5):1702-13. 

70. Zundel M, Kokkalis P. Theorizing as engaged practice. Organization Studies 2010;31(9-10):1209-27. 

71. Weick KE. Organizing for Transient Reliability: The Production of Dynamic Non-Events. Journal of Contingencies and Crisis Management. 2011;19(1):21-7. doi: 10.1111/j.1468-5973.2010.00627.x

72. Heggie V. Higher and colder: The success and failure of boundaries in high altitude and Antarctic research stations. Soc Stud Sci. 2016;46(6):809-32. Epub 2016/12/28. doi: 10.1177/0306312716636249. PubMed PMID: 28025914; PubMed Central PMCID: PMCPMC5207293. 

Disclosures: No author has professional or financial relationships with any companies that are relevant to this study. There are no conflicts of interest or sources of funding to declare. 

Corresponding Author
Daved van Stralen, MD, FAAP 
Associate Professor, Pediatrics 
Department of Pediatrics 
Loma Linda University School of Medicine 
11175 Campus Street 
CP-A1121 
Loma Linda, CA 92350 
Email: DVanStra@llu.edu 

Sean McKay 
Executive Partner / Director, Disruptive Rescue & Austere Medicine 
Element Rescue – Response Solutions within Nonlinear Complex Environments 
Greenville, South Carolina, United States 

Thomas A. Mercer 
Rear Admiral 
United States Navy (Retired) 

Acknowledgments
  • Karl Weick, Rensis Likert Distinguished University Professor of Organizational Behavior and Psychology, Emeritus, University of Michigan 
  • Errol van Stralen, Ancora Education
  • William J. Corr, formerly with the Los Angeles City Fire Department (retired) 
  • Dan Kleinman, Operations Section Chief, National Incident Management Organization (retired)
  • Ronald D. Stewart, Professor, Emergency Medical Services, Dalhousie University, Nova Scotia, Canada 