Richard Cook (safety researcher)
| Richard I. Cook | |
| --- | --- |
| Born | May 3, 1953 |
| Died | August 31, 2022 |
| Nationality | American |
| Alma mater | Lawrence University, University of Cincinnati |
| Known for | new look of safety, going solid, going sour, patient safety, how complex systems fail, line of representation, bone as archetype of resilience, being bumpable, sharp-end, first vs. second stories |
| Scientific career | |
| Fields | cognitive systems engineering, resilience engineering |
| Institutions | Control Data Corporation, The Ohio State University, University of Chicago, KTH Royal Institute of Technology, Adaptive Capacity Labs |
Dr. Richard I. Cook (May 3, 1953 – August 31, 2022)[1] was a system safety researcher, physician, anesthesiologist, university professor, and software engineer.[2] Cook did research in safety, incident analysis, cognitive systems engineering, and resilience engineering across a number of fields, including critical care medicine, aviation, air traffic control, space operations, semiconductor manufacturing, and software services.
Biography
Cook graduated cum laude from Lawrence University in 1975 from a customized program that included physics and urban planning. After completing his bachelor's degree, Cook took a position as a lead systems analyst at Control Data Corporation, working with finite element analysis programs such as ANSYS and NASTRAN on the CDC STAR-100 and managing teams of programmers and support analysts.[3]
In 1986, Cook received his MD degree from the University of Cincinnati, where he was a general surgery intern.[4] In 1994, he completed his anesthesiology residency at the Ohio State University.[4]
Cook served in the Department of Anesthesia and Critical Care at the University of Chicago as an associate professor[5] and Director of the Cognitive Technologies Laboratory[6] from 1994 to 2012, where he provided clinical care, taught and trained, conducted research, and performed community service.[3]
In 2012, Cook was named Sweden's first Professor of Patient Safety at KTH Royal Institute of Technology,[7] where he served until retiring in 2015 as professor emeritus.[3]
From 2015 to 2020, Cook worked as a research scientist at the Ohio State University in the Department of Integrated Systems Engineering. During this time, he also had a part-time appointment as a clinical professor of anesthesiology at the Ohio State University Wexner Medical Center, where he provided patient care and trained new medical practitioners.[3]
In 2017, Cook, along with John Allspaw and David Woods, founded a consulting company, Adaptive Capacity Labs.[2][3]
Patient safety work
Cook was active in the patient safety movement from the mid-1990s to the mid-2000s.[2][8][9] He was one of the founding board members of the National Patient Safety Foundation and served on its executive committee until 2007.[4] From 1998 to 2000, he advised the U.S. Veterans Health Administration (VA) on patient safety initiatives, and in 2000 he was appointed co-director of a VA "Gaps Center" funded on the basis of his research.[2]
In 1997, Cook helped organize the workshop "Assembling the Scientific Basis for Progress on Patient Safety" in Chicago, and co-authored the resulting report published the following year: A Tale of Two Stories: Contrasting Views of Patient Safety.[10]
In 2011, Cook served on an advisory panel for the Institute of Medicine on the topic of Health IT and Patient Safety. In the final published report, he wrote a dissent arguing that health IT software should be regulated as a class III medical device.[11]
How complex systems fail
In 1998, Cook wrote a treatise titled How Complex Systems Fail, later republished in the book Web Operations: Keeping the Data on Time[12] and in Hindsight magazine,[13] in which he identified eighteen characteristics of how complex systems fail.
In 2012, Cook gave a talk on the topic at the O’Reilly Velocity conference.[14]
New look
Cook was a proponent of what came to be known as the "new look" of safety[15][9][10] (referred to by Sidney Dekker as the "new view"[16]).
According to the new look, operators within safety-critical systems face competing demands, dilemmas, conflicts (both technical and organizational), and uncertainty. In particular, operators must continually balance the competing demands of achieving production goals and maintaining failure-free operations.
When accidents occur, they tend to be attributed to human error because of hindsight bias. Interventions made in the wake of accidents lead to a cycle of error: they increase the complexity of the system and create the potential for new failure modes.
Rather than treating the people in the system as the source of accidents, the "new look" perspective argues that it is the people who create safety in the system, that the work of human operators compensates for gaps in the designed system, and that successful work is far more common than failure.[15]
"Going sour"
Cook (along with David Woods and John McDonald) introduced the term going sour to refer to incidents in which there is a slow degradation of system performance over time.[17] Cook noted that going sour incidents are more complex and more difficult to describe than acute incidents, and that in these incidents the actions of human operators play a larger role in how the incident unfolds.
"Going solid"
Cook (along with Jens Rasmussen) introduced the term going solid to describe a significant shift in system operations that occurs when some form of capacity becomes exhausted.[18] The term originates in the nuclear power industry, where it is used as slang for a technical situation that has become difficult to manage. More literally, it describes a change in system behavior related to the state of a steam boiler: a boiler normally contains a mixture of steam and liquid water, and when it becomes completely filled with liquid water it is said to "go solid".
Cook applied the concept of "going solid" to an intensive care unit in a hospital undergoing a "bed crunch", when there are no longer enough beds to assign to patients. He noted that "going solid" situations tend to foster opportunities for accidents.
Line of representation
Cook noted that operators of software systems are not able to interact directly with the systems they supervise; instead, they interact through representations.[19] Operators see visual representations of the internal state of software systems, and they manipulate those representations in order to act on the system.
Cook used the term line of representation as a metaphor for distinguishing between two sets of entities. Above the line of representation lie people, organizations, and human processes. Below the line of representation are the software artifacts and infrastructure.
Cook noted that the people within an organization, who exist above the line, will describe in concrete terms the entities below the line, despite not being able to directly observe or act upon those entities.
Selected publications
- 1994 — Cook, R.I., Woods, D.D., Operating at the Sharp End: The Complexity of Human Error, in: Human Error in Medicine, Bogner, M.S., ed., CRC Press, June 1994 (doi:10.1201/9780203751725)
- 1998 — Cook, R.I., How Complex Systems Fail, Cognitive Technologies Laboratory, University of Chicago, Revision D (00.04.21)
- 1998 — A Tale of Two Stories: Contrasting Views of Patient Safety, National Health Care Safety Council of the National Patient Safety Foundation at the AMA, 1998
- 1998 — Cook, R.I., Two years before the mast: learning how to learn about patient safety, In: Scheffler, A.L., Zipperer, L.A., eds. Proceedings of the Second Annenberg Conference on Enhancing Patient Safety and Reducing Errors in Healthcare. Rancho Mirage, CA: National Patient Safety Foundation, pp. 61–64.
- 1998 — Cook, R.I. Being Bumpable, Proceedings of the Fourth Conference on Naturalistic Decision Making, May 29–31, 1998, The Airlie Conference Center, Warrenton, VA
- 2000 — Cook, R.I., Render, M., Woods, D.D., Gaps in the continuity of care and progress on patient safety, BMJ 2000;320(7237):791–794.
- 2005 — Cook, R.I., Rasmussen, J., "Going solid": a model of system dynamics and consequences for patient safety, Quality and Safety in Health Care 2005;14(2):130–134.
- 2005 — Cook, R.I., A brief look at the New Look in complex system failure, error, safety, and resilience, Cognitive Technologies Laboratory, University of Chicago, Revision AA (05.11.07).
- 2006 — Cook, R.I., Woods, D.D., Distancing Through Differencing: An Obstacle to Organizational Learning Following Accidents, in: Resilience Engineering, Nov. 2006 (doi:10.1201/9781315605685-28)
- 2010 — Woods, D.D., Dekker, S., Cook, R.I., Johannesen, L., Sarter, N. Behind Human Error, 2nd Edition, CRC Press, 2010.
- 2020 — Cook, R.I., Above the Line, Below the Line, ACM Queue 2020; 17(6) (doi:10.1145/3380774.3380777)
Recorded talks
The Future of above-the-line Tooling, SRECon Americas 2022
A Few Observations on the Marvelous Resilience of Bone & Resilience Engineering, REdeploy Conference, San Francisco, CA, 2019
Resilience In Complex Adaptive Systems, O'Reilly Velocity Web Performance and Operations Conference, New York, NY, 2013
How Complex Systems Fail, O'Reilly Velocity Web Performance and Operations Conference, Santa Clara, CA, 2012
Lectures on the study of cognitive work, The Royal Institute of Technology, Huddinge, Sweden, 2012
References
[edit]- ^ "Richard I. Cook, MD, 1953-2022". Biological Sciences Division, University of Chicago. 2022-09-20. Archived from the original on 2022-09-26. Retrieved 2022-09-26.
- ^ a b c d Woods, David; Allspaw, John; O'Connor, Michael. "The Career, Accomplishments, and Impact of Richard I. Cook: A Life in Many Acts". Retrieved 2022-09-16.
- ^ a b c d e Cook, Richard I. "Richard I. Cook, MD (LinkedIn page)". Retrieved 2022-09-16.
- ^ a b c "ctlab - Richard Cook". 2013-05-02. Archived from the original on 2013-05-02. Retrieved 2022-09-22.
- ^ "Department of Anesthesia & Critical Care (DACC) - The University of Chicago". 2011-11-10. Archived from the original on 2011-11-10. Retrieved 2022-09-22.
- ^ "Cognitive Technologies Laboratory". 2011-09-12. Archived from the original on 2011-09-12. Retrieved 2022-09-22.
- ^ "KTH Names Sweden's First Professor of Patient Safety". KTH. Retrieved 2022-09-23.
- ^ Leape, Lucian L. (2021). Making Healthcare Safe: The Story of the Patient Safety Movement. Cham: Springer International Publishing. doi:10.1007/978-3-030-71123-8. ISBN 978-3-030-71122-1. S2CID 235220920.
- ^ a b Wears, Robert; Sutcliffe, Kathleen (2019). Still Not Safe. Oxford University Press. doi:10.1093/oso/9780190271268.001.0001. ISBN 978-0-19-027126-8.
- ^ a b "A Tale of Two Stories: Contrasting Views of Patient Safety".
- ^ Institute of Medicine (US), Committee on Patient Safety and Health Information Technology (2012). Health IT and Patient Safety: Building Safer Systems for Better Care. National Academies Press. OCLC 871959270.
- ^ Allspaw, John; Robbins, Jesse (2010). Web Operations: Keeping the Data on Time (1st ed.). Sebastopol, CA: O'Reilly Media. ISBN 978-1-4493-7746-5. OCLC 686711401.
- ^ Cook, Richard (December 2020). "How complex systems fail". Hindsight. 31.
- ^ Velocity 2012: Richard Cook, "How Complex Systems Fail", retrieved 2022-09-29
- ^ a b Cook, R. (2004). "A brief look at the New Look in complex system failure, error, safety, and resilience" (PDF). Cognitive Technologies Laboratory. Revision W (4.09.08). University of Chicago.
- ^ Dekker, Sidney W.A. (October 2002). "Reconstructing human contributions to accidents: the new view on error and performance". Journal of Safety Research. 33 (3): 371–385. doi:10.1016/s0022-4375(02)00032-4. ISSN 0022-4375. PMID 12404999. S2CID 46350729.
- ^ Cook, Richard I.; Woods, David D.; McDonald, John S. (1991). "Human Performance in Anesthesia". doi:10.13140/RG.2.2.29675.36648.
- ^ Cook, R. (2005-04-01). ""Going solid": a model of system dynamics and consequences for patient safety". Quality and Safety in Health Care. 14 (2): 130–134. doi:10.1136/qshc.2003.009530. ISSN 1475-3898. PMC 1743994. PMID 15805459.
- ^ "Above the Line, Below the Line - ACM Queue". queue.acm.org. doi:10.1145/3380774.3380777. S2CID 211041647. Retrieved 2022-10-02.