Human Factors in Error-Proofing Healthcare


Zoran Bojic, Team Leader, Quality Programs, The Hospital for Sick Children, Toronto, ON, Canada


Human factors, Error-proofing, Cognitive biases






The work environment of many healthcare organizations is characterized by a rapid pace of change in medical technology, complex systems and processes, variable and unpredictable process inputs, tight operating budgets, staff shortages, and increasing pressure to reduce costs while improving patient safety and the quality of care. The increasing complexity of patient care, the inevitable problem of human fallibility, and the large volume of information arising from medical research present unique challenges to all stakeholders involved and further add to the complexity of the healthcare system. Growing attention to the severe consequences of errors made by healthcare professionals was triggered in 2000 by the Institute of Medicine (IOM) report To Err Is Human: Building a Safer Health System. Based on two large studies and extrapolation to over 33.6 million hospital admissions in the United States, the IOM report concluded that between 44,000 and 98,000 Americans die each year as a result of medical errors.

Due to the pervasiveness and often disastrous consequences of human errors in complex, integrated systems, the study of human error has become an important and increasingly common research topic. Much of the practical knowledge accumulated on human error is derived from industries capable of producing catastrophic events such as enormous loss of human life, explosions, fires, and releases of toxic chemicals. In contrast, human errors occurring within the healthcare system are numerous, serious, and insidious; however, they usually do not culminate in extensive news coverage.

Theoretical and methodological developments within the field of cognitive psychology have made it possible to better understand mental processes, explain some of the predictable manifestations of human fallibility, and develop effective strategies to eliminate or reduce human errors. Over the past few decades, researchers have made significant advances toward understanding human performance and the causes of accidents in complex systems. The scientific discipline of Human Factors Engineering has evolved as a unique and independent discipline focused on the safe design of products, systems, processes, tasks, jobs, and work environments, taking into consideration the needs, abilities, and limitations of people.

Learning Objectives

Knowledge and skills acquired in this interactive presentation will help participants understand the perceptual and cognitive mechanisms underlying human errors, identify complex interactions between humans and technology, anticipate and recognize patient safety risks, effectively apply error-proofing methods, and design robust systems and processes that are resilient to human errors. The major concepts that will be presented include the following:

1. Human information processing models and classification of human errors;
2. Fundamental sensory and cognitive characteristics of humans that make them vulnerable to errors while working in complex socio-technical systems;
3. Impact of heuristics and cognitive biases on decision making, and effects of internal and external performance shaping factors on human performance;
4. Person-centered approach and systems approach to the analysis and management of human errors;
5. Debiasing strategies, error-proofing methods, and process design principles used in developing reliable, efficient, resilient, and human-centered systems;
6. Practical applications of high reliability concepts in hospitals;
7. Risks and benefits of automation, and the multifaceted relationship between automation and human performance; and
8. Synergies between Human Factors Engineering and Lean Six Sigma.