Daved van Stralen, MD, FAAP, William Gambino, MA, MMAS
An organization’s High-Reliability Organization (HRO) attributes can become impediments to generating reliability and safety in ill-structured, dangerous, or life-preservation contexts. To what extent do subjective perceptions of what constitutes “reliability” develop into attack vectors for self-inflicted organizational sabotage? How could internal administrative or external regulatory pressure cause an otherwise reliable organization to focus solely on the strongest failure signals, oversimplify circumstances, centralize decision-making authority, vilify error, and disregard outliers? “Preoccupation with failure” becomes “preoccupation with error.” Error then loses its leverage for learning and understanding. The resulting fear of, or preoccupation with, error becomes an obstacle to comprehension, learning, and enactment.
We agree with Dr. Turbow (Turbow 2020) that HRO has the potential to improve outcomes (Roberts, Kuo, and van Stralen 2004; Roberts et al. 2005; van Stralen 2008; van Stralen et al. 2008) and engage medical errors beyond what medical experts envisioned for HRO (Nolan et al. 2004; Hines et al. 2008; Chassin and Loeb 2013; Department of Defense 2014). Unfortunately, the incomplete translation from HRO theory into HRO practice (van Stralen 2020) includes misinterpretation of errors as failure signals and the mistranslation of error and failure as exposure to liability.
A pronounced, almost singular, focus on error and liability may appear prudent, but it misdirects HRO processes, sacrificing responsiveness and adaptability for standardization and compliance with rules and processes. The drive for organizational reliability and safety often results in normative, reductionist, and linear solutions.
One person’s error is another person’s information. Until we experience something, we do not know exactly how we would act or what to expect. Acts are not mistakes; they become mistaken late in their development (Paget 1988, 45). The future branches in time (Goranko and Galton 2015), open to the influence of future contingents the individual may not anticipate. Until something makes the consequences visible, we do not notice them, but by then the antecedents have become lost to perception and the actor has departed. Organizations achieve High Reliability between the rules; whether they do so through errors captured by insiders, plans made by spectators, or both may depend on how one classifies actions, errors, and compliance.
We wonder how leadership would classify outliers: situations, actions, and outcomes that do not fit a category or meet a standard. Without classification, we lose sensitivity to exclusions, and the excluded elements become invisible (Bowker and Star 2000, 300-1), capable of unexpected appearance and abrupt disruption. The lack of clear definition and standardization, along with outside influences (Grober and Bohnen 2005), leaves medical error open to misclassification or rendered invisible through non-classification. To standardize is to invent error.
Classification systems have three characteristics, following abstract scientific logic: 1) they apply consistent and unique classificatory principles, 2) the categories are mutually exclusive, and 3) the system is complete (Bowker and Star 2000, 10-11). Classification influences thinking and acting. Consider how “error” and the “typology of error” influence whether the team engages a disruption as an error, a novel situation, or emergent circumstances. How one classifies the incident influences actions, communications, and documentation.
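To make this concrete, consider a minimal sketch, written in Python with hypothetical categories and responses, of how an incident typology that is consistent, mutually exclusive, and “complete” handles an event that fits none of its categories:

```python
# Minimal sketch (hypothetical categories and responses): a "complete"
# incident typology assigns every event to exactly one bin, and the bin,
# not the event, drives the organizational response.

INCIDENT_TYPOLOGY = {
    "medication_error": "file report, notify risk management",
    "equipment_failure": "file report, notify biomedical engineering",
    "documentation_lapse": "correct the record, counsel staff",
}

def respond(event_description: str, assigned_category: str) -> str:
    """Return the mandated response for an incident as classified."""
    if assigned_category in INCIDENT_TYPOLOGY:
        return INCIDENT_TYPOLOGY[assigned_category]
    # Completeness in practice: anything else lands in a catch-all bin,
    # where it no longer generates learning, follow-up, or documentation.
    return "other: no defined response"

# A novel, emergent situation has no category of its own; the event text is
# never consulted, only the label someone chose for it.
print(respond("unexpected interaction of two 'correct' protocols", "novel_situation"))
# -> other: no defined response
```

The assigned category, not the event itself, determines the response; the outlier survives only as “other,” stripped of the detail a team would need to learn from it.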
But concrete experience and immediate perceptions have degrees of truth and different ways of being true, following modal logics of degree, necessity, and possibility (Garson 2016). We must not allow abstractions of error and liability to substitute for a concrete, value-free rendering of fact. Alfred North Whitehead (1926/1967, 64) warns against the “fallacy of misplaced concreteness,” of accepting abstractions as the most concrete rendering of fact. Error as an abstraction interferes with the individual’s ability to leverage error: to make a mistake and, with colleagues, discover what steps to take without suffering further consequences (Rosson and Carroll 2005, 87-8).
Classifying elements makes visible, and easier to surveil, what the dominant domain considers important (Bowker and Star 2000, 30, 44-46); consider, for example, the International Classification of Diseases (ICD-10) and its criteria for diagnosis. A premature infant can be classified by diagnoses (limited to ICD-10), physiological derangements (hypoxemia, peripheral perfusion), nursing care (intravenous infusions, medication administration, feeding methods), or technology demands (mechanical ventilation, ECMO). What is important depends on whether we look top-down or bottom-up, from the isolette or from the administration.
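Another minimal Python sketch, with hypothetical record fields and values, illustrates how the lens, rather than the infant, determines what becomes visible:

```python
# Minimal sketch (hypothetical record fields and values): the same premature
# infant rendered through four classification lenses. Each lens makes
# different work visible; what falls outside the lens goes unseen.

infant = {
    "diagnoses": ["P07.2 extreme prematurity", "P22.0 respiratory distress"],
    "physiology": ["hypoxemia", "poor peripheral perfusion"],
    "nursing_care": ["IV infusions", "medication administration", "gavage feeding"],
    "technology": ["mechanical ventilation"],
}

def visible_to(lens: str) -> list:
    """What an observer using this lens can see; everything else drops out."""
    return infant.get(lens, [])

print(visible_to("diagnoses"))   # roughly what administration and billing see
print(visible_to("physiology"))  # roughly what the bedside team engages at the isolette
```

Each lens serves its own community of practice; none shows the whole infant.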
Problems interact with the environment, creating abrupt, disorienting changes. With rules created independently of context and an insistence on compliance, subordinates look for evidence supporting discrete rules rather than generating information to resolve the problem. Rules confound the application of discrete concepts to continuously evolving events. Compliance, in the absence of an identifiable rule or applicable process, inhibits initiative and prevents the experiences of failing from which we learn (van Stralen, McKay, and Mercer 2020). Because we cannot identify failures from not acting, we cannot correct such errors, and belief in the value of “not acting” becomes incorporated into cultural knowledge (Weick 1979, 148). A pattern of presumed successes then forms, giving the illusion of legitimacy, halting learning, and reinforcing belief in the value and importance of compliance.
But to what, or to whom, are we compliant? Environments where people must move between ill-structured and well-structured problems confound people anchored in the normative stance and leave them uncomfortable. Standardization, enforced by error management, creates a perception of control that reduces this discomfort. As a measurement of not reaching the standard, error does have important functions in education, documentation, recovery of information, and the creation of common ground between diverse domains, communities of practice, and regulatory agencies (Star and Griesemer 1989; Bowker and Star 2000, 15-16). But standardization’s significant inertia against change (Bowker and Star 2000, 325), supported by error management, makes standardization an effective mechanism to “control the tacking back-and-forth, and especially, to standardize and make equivalent the ill-structured and well-structured aspects” (Star 2010). Standardization through error management effectively normalizes behaviors, reinforces compliance, and inhibits action outside of organizational norms. Compliance for the purpose of error reduction becomes normative behavior.
From the top-down normative stance, compliance, readily measurable against idealized standards (Bowker and Star 2000, 15), makes more sense. The inertia of standardization overcomes the HRO characteristic “reluctance to simplify.” Complexification and agility, now circumscribed, can no longer support problem resolution and achievement of an accepted end-state. The pragmatic stance has become an organizational outlier.
From the bottom-up pragmatic stance, error emerges from local, nonlinear interactions, manifesting the environment entwined with human intent. Invisible processes complicate interventions. During contingent circumstances, error avoidance occupies working memory, and, when people most need thought, performance decreases. Error correction, on the other hand, drives engagement, extending operations into adverse conditions and hostile environments.
Elaborating compliance in this manner reveals its negative side: outside interference becomes detrimental; error, safety, and liability become failure signals; and security becomes compliance-based. Perhaps we can gain some understanding by viewing error and compliance through the domains of law, business, and sabotage.
For questions of HRO and the law, we have deferred to the late Assistant US Attorney Michael “Mike” A. Johns, who advised us that the decision-making elements of HRO could offer protection from legal action, particularly through understanding and use of heuristics and the consequent biases. For the HRO, error corrects heuristic bias (van Stralen, McKay, and Mercer 2020). As in our opening paragraph, Johns was concerned “whether influences from outside the agency itself might be contributing to decision errors (their attorneys, their court system, etc.)” (personal communication).
HRO, through preoccupation with failure, makes visible safety lapses and the breach of duty in liability. However, acting as outside influences, liability and safety themselves contribute to decision errors. C. Northcote Parkinson (1955) observed that his eponymous Parkinson’s Law, “work expands so as to fill the time available for its completion,” contributed to economic inefficiency during WWII. Any criticism would likely be met with, “Don’t you know there’s a war on?” (Stevenson 1993). For example, healthcare executives, resistant to a patient safety study out of concern for liability to the hospital, queried a committee about liability. One member asked, “What duty are we breaching?” The executives could not articulate any duty the study would breach. Queries about liability and safety can easily terminate, or endanger, the extension of operations into ambiguity, adversity, or threat. The hospital never conducted the study.
Physical protection systems (PPS) protect nuclear facilities against theft or sabotage (high-consequence, low-probability events). Performance criteria select elements and procedures for overall system performance, while feature criteria (also called compliance-based criteria) select elements for the presence of certain items (Garcia 2007, 64-5). “The use of a feature criteria approach in regulations or requirements that apply to a PPS should generally be avoided or handled with extreme care. Unless such care is exercised, the feature criteria approach can lead to use of a checklist method to determine system adequacy, based on the presence or absence of required features. This is clearly not desirable, since overall system performance is of interest, rather than the mere presence or absence of system features or components” (Garcia 2007, 8).
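A minimal Python sketch, with hypothetical features and figures chosen for illustration rather than drawn from Garcia, contrasts the two criteria:

```python
# Minimal sketch (hypothetical features and numbers): feature criteria ask
# whether required components are present; performance criteria ask whether
# the assembled system works against a scenario.

REQUIRED_FEATURES = {"perimeter_sensors", "cameras", "response_force"}

def feature_criteria(installed: set) -> bool:
    """Compliance-based: pass if every required item appears on the checklist."""
    return REQUIRED_FEATURES <= installed

def performance_criteria(p_detect: float, delay_min: float, response_min: float) -> bool:
    """Performance-based: pass only if detection is likely and the response
    force arrives before the adversary's task is complete."""
    return p_detect >= 0.9 and response_min < delay_min

installed = {"perimeter_sensors", "cameras", "response_force"}
print(feature_criteria(installed))                  # True: every box is ticked
print(performance_criteria(p_detect=0.6,            # detection is unlikely and
                           delay_min=4.0,           # the response arrives after
                           response_min=10.0))      # the delay expires -> False
```

A system can satisfy the checklist while failing the scenario, which is why Garcia cautions against feature criteria as a measure of system adequacy.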
Preoccupation with failure logically leads to error as a failure signal and liability exposure as a potential failure. Behaviors to prevent failure or reduce liability exposure include “doing everything through channels,” “refer all matters to committees” which should be “as large as possible — never less than five,” “advocate caution,” “urge your fellow-conferees to be reasonable and avoid haste,” “worry about the propriety of any decision — raise the question of whether such action as is contemplated lies within the jurisdiction of the group or whether it might conflict with the policy of some higher echelon,” and “apply all regulations to the last letter.” These quotations exemplify a type of “simple sabotage” that requires “no destructive tools whatsoever and produces physical damage, if any, by highly indirect means.” During World War II, the United States Office of Strategic Services (OSS) contributed to undermining Nazi industrial efforts by teaching these “simple sabotage” methods to civilian workers in occupied Europe (Office of Strategic Services 1944). Yet leaders commonly accept these methods as a prudent means to prevent error and reduce liability.
Could an otherwise reliable organization be targeted through its own HRO attributes? That is, do error, liability, and safety, treated as singular failure signals that divert support from line staff, sabotage efforts to generate reliability and safety?
References:
- Bowker, Geoffrey C., and Susan Leigh Star. 2000. Sorting things out: Classification and its consequences. Cambridge, MA: MIT Press.
- Chassin, Mark R., and Jerod M. Loeb. 2013. “High-reliability health care: getting there from here.” The Milbank Quarterly 91(3): 459-90.
- Department of Defense. 2014. Department of Defense Briefing by Secretary Hagel and Deputy Secretary Work on the Military Health System Review in the Pentagon Press Briefing Room. https://www.defense.gov/Newsroom/Transcripts/Transcript/Article/606937/. Retrieved April 6, 2020.
- Garcia, Mary Lynn. 2007. Design and evaluation of physical protection systems. Burlington, MA: Elsevier.
- Garson, James. 2016. “Modal Logic.” In The Stanford Encyclopedia of Philosophy (Spring 2016 edition), edited by Edward N. Zalta. https://plato.stanford.edu/archives/spr2016/entries/logic-modal/.
- Goranko, Valentin, and Antony Galton. 2015. “Temporal Logic.” In The Stanford Encyclopedia of Philosophy (Winter 2015 edition), edited by Edward N. Zalta. https://plato.stanford.edu/archives/win2015/entries/logic-temporal/.
- Grober, Ethan D., and John M. A. Bohnen. 2005. “Defining medical error.” Canadian Journal of Surgery 48(1): 39-44.
- Hines, Stephen, Katie Luna, Jennifer Lofthus, Michael Marquardt, and Dana Stelmokas. 2008. Becoming a High Reliability Organization: Operational Advice for Hospital Leaders. AHRQ Publication No. 08-0022. Rockville, MD: Agency for Healthcare Research and Quality.
- Nolan, Thomas, Roger Resar, Frances Griffin, and Ann B. Gordon. 2004. Improving the reliability of health care. Boston, MA: Institute for Healthcare Improvement.
- Office of Strategic Services. 1944. “Simple Sabotage Field Manual — Strategic Services (Provisional).” Strategic Services Field Manual No. 3. Washington, DC: Office of Strategic Services.
- Paget, Marianne A. 1988. The unity of mistakes: A phenomenological interpretation of medical work. Philadelphia, PA: Temple University Press.
- Parkinson, C. Northcote. 1955. “Parkinson’s Law.” The Economist, November 19, 1955. London, England: The Economist Group.
- Roberts, Karlene H., Kuo Yu, and Daved van Stralen. 2004. “Patient safety is an organizational systems issue: Lessons from a variety of industries.” In Patient Safety Handbook, 2nd edition, 169-86.
- Roberts, Karlene H., Peter M. Madsen, Vinit Desai, and Daved van Stralen. 2005. “A case of the birth and death of a high reliability healthcare organisation.” BMJ Quality & Safety 14(3): 216-20.
- Rosson, Mary Beth, and John M. Carroll. 2005. “Minimalist design for informal learning in community computing.” In Communities and Technologies: Proceedings of the Second Communities and Technologies Conference, Milano 2005, edited by Peter van den Besselaar, Giorgio de Michelis, Jenny Preece, and Carla Simone, 75-94. Dordrecht, The Netherlands: Springer Science & Business Media.
- Star, Susan Leigh. 2010. “This is not a boundary object: Reflections on the origin of a concept.” Science, Technology, & Human Values 35(5): 601-17.
- Star, Susan Leigh, and James Griesemer. 1989. “Institutional Ecology, ‘Translations,’ and Boundary Objects: Amateurs and Professionals in Berkeley’s Museum of Vertebrate Zoology, 1907–1939.” Social Studies of Science 19(3): 387–420.
- Stevenson, Richard W. 1993. “C. Northcote Parkinson, 83, Dies; Writer With a Wry View of Labor.” New York Times, March 12, 1993, Section A, page 19.
- Tinbergen, Niko. 1963. “On Aims and Methods of Ethology.” Zeitschrift fur Tierpsychologie [Journal of comparative ethology] 20 (4): 410–33.
- Turbow, Robert. 2020. “Medical Legal Forum: High Reliability in Health Care – An Introduction.” Neonatology Today 15(7): 110-11.
- van Stralen, Daved. 2008. “High-reliability organizations: Changing the culture of care in two medical units.” Design Issues 24(1): 78-90.
- van Stralen, Daved. 2020. “Pragmatic High-Reliability Organization (HRO) during Pandemic COVID-19.” Neonatology Today 15(4): 3-9.
- van Stralen, Daved W., Racquel M. Calderon, Jeff F. Lewis, and Karlene H. Roberts. 2008. “Changing a pediatric sub-acute facility to increase safety and reliability.” Advances in Health Care Management 7: 259-82.
- van Stralen, Daved, Sean D. McKay, and Thomas A. Mercer. 2020. “Flight Decks and Isolettes: High-Reliability Organizing (HRO) and Pragmatic Leadership Principles during Pandemic COVID-19.” Neonatology Today 15(7): 113-22.
- Weick, Karl E. 1979. The Social Psychology of Organizing (2nd ed.). New York: McGraw-Hill.
- Whitehead, Alfred North. 1926 [1967]. Science and the Modern World, (Lowell Institute Lectures 1925), Cambridge: Cambridge University Press. Reprinted New York: The Free Press.
Disclosure: The authors have no disclosures.