Patient safety in a new age of understanding

P. James A. Ruiter, MD

 

Engaged health care teams are necessary for any improvement in quality or safety. But health care teams will not engage simply because we ask them to do so. Any improvements in health care will require leaders to understand the why of disengagement. In the first of this three-part series — Disengagement in health care: today’s new culture — we began to understand why only 57% of health care workers are engaged and why that number is decreasing annually. We now examine how health care’s approach to safety and quality has unwittingly been complicit in building this disengagement.

 

KEY WORDS: physician engagement, patient safety, quality improvement, resilient health care

 

It has been 20 years since the Institute of Medicine’s landmark report, To Err is Human,1 was presented to Congress. Is there anything to celebrate? Although pockets of excellence exist,2-4 the number of deaths resulting from preventable adverse events keeps rising.5,6 Over the last 20 years, it appears that we have been arguing over the numbers, while “action and progress on patient safety is frustratingly slow.”5 The overall harm rate has stalled at 10%.7 This lack of progress can, in and of itself, act as a disincentive to front-line teams. To understand why the impact of efforts has been minimal, we need to examine our recent past as it pertains to quality and safety.

 

The human as a liability

 

Humans are inattentive, make mistakes, and routinely break the rules. Loosely classified as cognitive error, inattention manifests as slips and memory lapses; a misdialed telephone number is an example. These types of error occur with great frequency in health care. In the past, such slips were retrained away, but does re-teaching a telephone number really help prevent misdialing it in the future? Retraining satisfies a sense of corporate responsibility, but does nothing to reduce the likelihood of recurrence: no change in the rate of harm ensues.

 

From a human-centric perspective, both inattentiveness and mistakes have been well addressed by an approach known as “just culture.” Just culture postulates that, with cognition-based errors, workers are already punishing themselves; thus, the right approach is to console the colleague involved. With knowledge-based mistakes, training, not punishment, is the appropriate response. However, knowledge alone will do little; it is its application in our interprofessional environment that is key to modifying the harm rate.

 

A final category of human error exists where just culture and more routine approaches to error might suggest punishment: routine violations. Sabotage and the intentional harming of a patient must be stopped immediately and punished. However, deviance is often normalized and added to the routine-violations column and — depending on how just culture is implemented or how the deviance is interpreted — may lead to unintended consequences driven by fear of punishment.

 

Normalization of deviance has been defined as: “People within the organization become so much accustomed to a deviant behaviour that they don’t consider it as deviant, despite the fact that they far exceed their own rules for elementary safety.”8

 

At face value, it seems entirely logical to punish all forms of deviance.

 

However:

 

  • What were the factors that encouraged the team to make the decisions they did?
  • Can every rule actually be followed in hospitals’ busy units?
  • Is every deviance bad?

 

As an extension of the last question: what would happen if positive deviance were punished because it was not recognized or tolerated as such? Positive deviance occurs naturally as humans adjust complex processes to find a better way of doing things.

 

So how common is deviance?

 

The reality is that deviance is ubiquitous. In any human endeavour, four types of work are described:

 

  • Work as imagined — what individuals who do not actually do the work imagine it to be
  • Work as prescribed — a work process as created, such as a policy, procedure, or guideline
  • Work as done — what the work actually looks like when translated and performed by those who do it
  • Work as disclosed — what those who do the work report the work to have been (heavily influenced by the presence or absence of psychological safety)9,10

 

Ideally, these types of work would all look the same. However, they rarely resemble each other. Why?

The answer becomes clear when we learn that over 600 policies, procedures, and guidelines are in place at the average hospital to guide the front-line team through a day of work.11 There are simply too many rules for anyone to know about, remember, look up, follow, or keep up to date. In 2016, the half-life of medical knowledge was just 5.5 years; thus, keeping 600+ policies up to date is impossible.12 In 2015, Johns Hopkins reported that, in one of its intensive care units, only 40% of patients received “proper treatment” despite the use of bundles and checklists, in part because there was not enough time to apply them all.13
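
A back-of-envelope calculation makes the scale of this maintenance burden concrete. The arithmetic below is illustrative only: it takes the two figures cited above (600 policies, a 5.5-year half-life) and adds our own simplifying assumption that each policy must be substantively revised once the knowledge underlying it has turned over:

$$\text{revisions per year} \approx N \cdot \frac{\ln 2}{t_{1/2}} = 600 \times \frac{0.693}{5.5} \approx 76$$

That is roughly one and a half substantive policy revisions every week, indefinitely, before a single new policy is added.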

 

Work as done cannot resemble work as prescribed

 

The organization is under stress, the stress of production in an environment of economic and workload challenges.14 These challenges, and the need to continually “produce,” lead the organization to seek efficiency from its staff.15 The need for efficiency, which comes at the expense of thoroughness, amounts to the organization giving tacit approval to work as done even if it deviates from the rules.

 

When something goes wrong, the organization, in hindsight, realizes that the team should have been thorough in that particular instance. Thus, the worker can be held accountable for a rule that they could never have followed in the first place. The frequent judgement: harm occurred because the worker did not follow the rule. This is simply untrue; no one can say whether the incident (or another) would — or would not — have occurred if the rule had been followed in this instance, i.e., in the multidimensional, unpredictable, and complex environment that is health care today.

 

As a final consequence, two things are likely to follow as a result of the investigation: a new policy will be created to try to prevent recurrence, and the original rule will not be improved. Our teams get yet another policy to follow, workers remain disengaged, and no change in the rate of harm ensues.

 

In summary, equating deviance with rule breaking contributes to the silence surrounding the fact that work as done does not resemble work as prescribed. It can lead to multiple ways of doing things in the same case, which can be chaotic and increase harm. Staff disengages, and nothing changes.

 

Care is not linear

 

The early days of patient safety saw the development of linear foundational models to help us understand error causality.16 Policies (or protective solutions to error) evolved, but they assumed that care was linear and that workload was average. This traditional thinking remains despite our new understanding of complexity. Care is in fact multidimensional, and there is rarely a day with an average workload.

 

Furthermore, if care is understood to be multidimensional and tightly coupled,14 then it becomes evident that the linearity of safety models has led us to produce linear protective solutions to sequences of events that may never actually occur again. In other words, a series of apparently sensible decisions, made by a team, balancing a unique multidimensional context, with a unique sequence of events will likely never recur in that same way again.

 

The creation of these protective solutions can have unintended consequences: the diversion of resources from other important tasks (opportunity cost), increased risk in other areas (collateral damage), and inefficiency (600+ protocols) (Andrew Kotaska, MD, School of Population and Public Health, University of British Columbia, 2018, personal communication). It also means that we are always playing catch up, as opposed to creating resilient teams that can “absorb” and catch errors before they occur, or limit the damage when they do occur17 (Michael Gardam, MD, Schulich Executive Education Centre, York University, 2018, personal communication).

 

Looking at safety in the traditional sense is disengaging

 

Safety is understood as a non-event: the absence of harm. In other words, if nothing bad happened, the patient was safe. This makes measuring safety nearly impossible; how do you account for something that did not happen? Assuming we knew the denominator, if a harm event occurred at a rate of 1 in 5000 activities, any intervention put in place to reduce its occurrence would typically need another 5000 activities to reveal itself as effective.18 This is not engaging for the health care team, as it amounts to introducing an intervention and watching for the occurrence of nothing. Although it is important to understand why the system fails, achieving safety gains by reviewing harm events alone means that quality improvement is reactive to harm, picking off only episodic chances for improvement. This approach is traditional and has become known as protective safety, or safety I.10
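
The measurement problem can be made concrete with a short calculation. The sketch below is illustrative only: the 1-in-5000 rate comes from the paragraph above, but the hypothetical halving of harm, the significance level, and the power are our assumptions. If anything, it suggests the 5000-activity figure understates the difficulty, because statistically demonstrating an improvement takes far more observations than merely expecting one event.

```python
from math import sqrt

# Illustrative assumptions (not from the article): baseline harm rate of
# 1 in 5000 activities, and a hypothetical intervention that halves it.
p0 = 1 / 5000          # assumed baseline harm rate per activity
p1 = p0 / 2            # hypothetical post-intervention rate

# Over the next 5000 activities we expect ~1 harm event without the
# intervention and ~0.5 with it: a difference invisible in practice.
n = 5000
print(f"expected events in {n} activities: {n * p0:.1f} vs {n * p1:.1f}")

# Standard two-proportion sample-size approximation (alpha = 0.05 two-sided,
# power = 0.80) to show the scale of the measurement problem.
z_alpha, z_beta = 1.96, 0.84
p_bar = (p0 + p1) / 2
n_required = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
              + z_beta * sqrt(p0 * (1 - p0) + p1 * (1 - p1))) ** 2 / (p0 - p1) ** 2
print(f"activities per arm needed to detect the halving: ~{n_required:,.0f}")
# ~235,000 activities per arm: watching for the occurrence of nothing.
```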

 

Current “safety management is based on analyzing situations where something went wrong — a set of snapshots of a system that has failed, described in terms of individual parts or system structures.”19 This is reinforced by typical hospital safety management structures that expect stability, certainty, and predictability and seek to control, manage, and restrict — in other words attempt to engineer-out failure.20

 

The disconnect is that modern-day health care does not function in a stable, certain, and predictable environment. Nor does it function on stable, certain, or predictable patients with stable conditions. The status quo safety solution — treating health care like an aircraft carrier or a nuclear power plant — is incongruent with the real environment it purports to protect. Thus, it is highly likely that engineering-out failure alone is not an effective or engaging method of enhancing safety in a complex and adaptive system.

 

Culture

 

Culture is likely the most important factor in making a successful team. On the 15th anniversary of the publication of To Err is Human,1 the National Patient Safety Foundation (NPSF) convened an expert panel to examine and understand the slow pace of improvement in safety. Its report, Free from Harm: Accelerating Patient Safety Improvement Fifteen Years after To Err Is Human,21 identified culture as the key.

 

Culture is often summarized as “what we do around here,” which makes it appear a monolith that no single person could change. But if we see culture as “what we choose to tolerate,” then each one of us plays a critical role in making the changes necessary to develop the culture we wish to see. Quality improvement in health care is best focused on achieving that culture — a culture by design. The importance of creating a positive culture is supported by a 2017 systematic review published in BMJ Open that highlights the positive correlation between culture and patient outcomes.22

 

Knowledge

 

Also discussed in the NPSF report was Schultz’s concept of knowledge.21 He is said to have stated that knowledge moves through three sequential phases: superficial simplicity, confusing complexity, and profound simplicity. Safety in health care is currently in the state of confusing complexity.

 

Superficial simplicity was a state in which we naively believed that the adoption of some of the safety approaches from the airline industry (a complicated system, not a complex one) would lead to significant improvements in health care quality and safety.

 

Confusing complexity is the current state: over 600 policies, guidelines, checklists, and procedures guiding our front-line personnel in their day’s work.

 

Profound simplicity is a situation where the right balance of policies and guides exists along with a better understanding of the system and the people we work with — a state yet to be realized.

 

Paul Gluck (past chair of both the NPSF and the Council for Safety in Women’s Healthcare) proposed the graph in figure 1, where care becomes less and less safe once a certain number of safety tools are present in the environment. We have passed the crest of the curve and the arrow indicates that we are somewhere on the downslope in terms of safety. Today, there are well-described instances of “checklist fatigue,” and many examples of work as done looking very different from work as prescribed. In short: “A worker following a safety rule can create a condition to enable safety to emerge. Too many safety rules can overwhelm, and frustrate a worker, enabling danger to emerge.”20

 

In Part I,23 we explored why legislating adherence to “rules” was not effective, yet health care organizations continue to respond to safety concerns by producing more and more policies, procedures, and guidelines. There seems to be little interest in understanding why following them is so difficult. As an industry, we keep trying different ways to implement guidelines and best practices successfully. This is akin to pushing a string uphill. In a complex system, one needs to work with the forces at play, not against them.24

 

Health care is a complex adaptive system

 

Finally, complexity science tells us that health care is a complex adaptive system. This is best explained by first describing what simple and complicated systems are, using Snowden’s Cynefin framework as adapted by Zimmerman and Gardam.2,25

 

A simple system is baking a cake.

  • Following a list of simple linear tasks/processes will lead to the creation of a cake.
  • The process can be easily taken apart, understood, rebuilt, and the outcome achieved again and again.
  • Safety features can be put in place to reduce the likelihood of failure in achieving the outcome. Those features include lists of ingredients, standardized tools, and process maps (recipes). Importantly, these safety features also increase the likelihood of success.

 

A complicated system is placing a human on the moon.

  • Following multiple and intricate linear tasks with interdependencies, completed in the correct order and followed in minute detail, will get a human on the moon.
  • The process can be broken down, understood, rebuilt, and the outcome achieved again and again.
  • Safety measures can be put in place to reduce the likelihood of failure in achieving the outcome. Those measures include checklists and strict adherence to rules and standard operating procedures (these features work to engineer-out failure). Like those of a simple system, these safety features also increase the likelihood of success.

 

Stringent safety measures developed for complicated systems will work in simple systems. The airline industry and nuclear power stations are complicated systems.

 

But as unpredictability increases, a system becomes complex; an example of such a system is raising a child.

  • A complex system is a multidimensional process that cannot be broken down, understood, rebuilt, and the same outcome obtained. There is no guarantee that if you raise one child successfully, the same “recipe” will lead to a second success.
  • Safety measures to reduce the likelihood of failing to successfully raise a child do not exist, per se. Instead, we naturally gravitate toward general principles or simple rules that increase the likelihood of successfully raising the child: love, guidance, nurturing, consistency (these features do not work to engineer-out failure; they work to engineer-in success).

 

In a complex system, solutions used for simple and complicated systems have limited utility; they do not reduce the likelihood of failure and may, at times, increase it.26 They can cause harm through unintended consequences. History is rife with examples; Apollo 13 and the landing of US Airways 1549 on the Hudson River are two. In both, a complicated system was disrupted by an unpredictable event. In that moment, they went from being complicated systems to complex ones. In both cases, human ingenuity saved the lives of those involved, whereas following the rules in place would likely have caused the death of all.

 

Finally, a complex adaptive system adapts to meet the needs of its unpredictably changing context. It does so because of the flexibility of its people,24 a fact lost on many except for those doing the flexing. It is a flexibility that is restricted by current safety approaches.

 

Note: although health care is a complex adaptive system, it contains elements of simple and complicated systems. Therefore, further improvements in safety are not about abolishing the work done to date, but about matching the right safety tool to the right part of the system within the whole.

 

What does all this mean?

 

Complex challenges require a different approach from the ones leaders have been familiar with. This approach must combine critical examination of the system’s components, balanced with a keen understanding of the interactions of the people who work in it.25 An opportunity exists to combine the traditional focus on engineering-out failure (protective safety or safety I) with engineering-in success (productive safety or safety II), which views safety as a dynamic event — analyzing the 4999 times out of 5000 that an activity went well, understanding why it went well, and increasing the likelihood of future successes.10,18,26,27

 

Engaged health care teams are necessary for improvements to occur. But health care teams will not engage simply because we ask them nicely. Leaders need to fully understand what has led us to this point; only by understanding our past and current context can we find the keys to future improvement. Parts I and II of this series have explained our current situation. In Part III, we will bring those elements together and present a way forward to improve engagement in, and the safety of, health care.

 

References

1.Kohn LT, Corrigan JM, Donaldson MS, editors. To err is human: building a safer health system. Washington, DC: National Academy Press; 1999. https://www.nap.edu/read/9728/chapter/1

2.Zimmerman B, Reason P, Rykert L, Gitterman L, Christian J, Gardam M. Front-line ownership: generating a cure mindset for patient safety. Healthc Pap 2013;13(1):6-22.

3.Gardam M, Gitterman L, Rykert L, Vicencio E, Bailey E. Five years of experience using front-line ownership to improve healthcare quality and safety (essay). Healthc Pap 2017;17(1):7-23.

4.Geary M, Ruiter PJA, Yasseen III AS. Examining the effects of an obstetrics interprofessional programme on reductions to reportable events and their related costs. J Interprof Care 2018. DOI: 10.1080/13561820.2018.1543255

5.James JT. A new, evidence-based estimate of patient harms associated with hospital care. J Patient Saf 2013;9(3):122-8. DOI: 10.1097/PTS.0b013e3182948a69

6.Makary MA, Daniel M. Medical error — the third leading cause of death in the US. BMJ 2016;353:i2139. DOI: 10.1136/bmj.i2139

7.Braithwaite J. Changing how we think about healthcare improvement. BMJ 2018;361:k2014. DOI: 10.1136/bmj.k2014

8.Interview: Diane Vaughan, sociologist, Columbia University. Quantorg; 2008. https://tinyurl.com/hxjv2mz

9.Ombredane A, Faverge JM. Le travail prescrit et le travail réalisé. In: L’analyse du travail. Paris: Presses universitaires de France; 1955.

10.Hollnagel E. Safety-I and safety-II: the past and future of safety management. Farnham, UK: Ashgate; 2014.

11.Braithwaite J. Resilient health care, volume 3: reconciling work-as-imagined and work-as-done. Boca Raton, Fla.: CRC Press; 2016.

12.Amalberti R. Patient safety as a moving target: implications for improvement strategies. Proceedings from the 1st Latin American Forum. International Society for Quality in Health Care; 2016.

13.Sands K, Romig M, Dykes P, Schell-Chaple H. Redesigning care — a new playbook to improve quality, safety and patient-centered care. Presented at the 23rd annual American Hospital Association Leadership Summit. Chicago: American Hospital Association; 2015. https://tinyurl.com/y4ha2z2k

14.Cook R, Rasmussen J. “Going solid”: a model of system dynamics and consequences for patient safety. Qual Saf Health Care 2005;14(2):130–4. DOI: 10.1136/qshc.2003.009530

15.Hollnagel E. The ETTO principle — efficiency-thoroughness trade-off. London: Taylor and Francis Group; 2009.

16.Reason J. Managing the risks of organisational accidents. Aldershot, UK: Ashgate Publishing; 1997.

17.Knox GE, Rice Simpson K. Perinatal high reliability. Am J Obstet Gynecol 2011;204(5):373-7. DOI: 10.1016/j.ajog.2010.10.900

18.Staender S. Safety-II and resilience: the way ahead in patient safety in anaesthesiology. Curr Opin Anaesthesiol 2015;28(6):735-9. DOI: 10.1097/ACO.0000000000000252

19.Hollnagel E. Resilience in healthcare. In Proceedings of the annual CCSO Quality Conference, Toronto, 23 March 2016. Toronto: Critical Care Services Ontario; 2016.

20.Wong G. 7 Implications of complexity for safety. Safety Differently blog; 2018. https://tinyurl.com/yymn5njc

21.Free from harm: accelerating patient safety improvement fifteen years after To err is human. Boston: National Patient Safety Foundation; 2015. https://tinyurl.com/y3t7mvyc

22.Braithwaite J, Herkes J, Ludlow K, Testa L, Lamprell G. Association between organisational and workplace cultures, and patient outcomes: systematic review. BMJ Open 2017;7(11):e017708. DOI: 10.1136/bmjopen-2017-017708

23.Ruiter PJA. Disengagement in health care: today’s new culture. Can J Physician Leadersh 2018;5(3):165-9.

24.Braithwaite J, Churruca K, Ellis LA, Long JC, Clay-Williams R. Complexity science in healthcare — aspirations, approaches, applications and accomplishments: a white paper. Sydney: Australian Institute of Health Innovation, Macquarie University; 2017.

25.Gardam M. The complex road to lasting change. Breakfast with the chiefs video. Toronto: Longwoods; 2017. http://www.longwoods.com/audio-video/all/1/7607

26.Johnson A. Framework for better care: proceedings of the 6th Resilient Health Care Meeting. Vancouver: University of British Columbia; 2017.

27.Ruiter PJA. Implementing patient safety initiatives. Obstet Gynecol Clin N Am 2019;46(2):281-92. DOI: 10.1016/j.ogc.2019.01.005

 

Author

P. James A. Ruiter, BMSc, MD, MCFP, is medical director and vice president at Salus Global Corporation, which helps health care organizations achieve better clinical, economic, and operational outcomes through its interprofessional patient safety and quality improvement programs. Dr. Ruiter is also on the knowledge translation and implementation science faculty at the Canadian Patient Safety Institute and, since 2009, has chaired the Obstetrical Content Review Committee of the Society of Obstetricians and Gynaecologists of Canada.

 

Disclosure

Because of its focus, the Salus Global Corporation is not considered a commercial interest under Accreditation Council for Continuing Medical Education standards. It is owned by the Society of Obstetricians and Gynaecologists of Canada, the Healthcare Insurance Reciprocal of Canada, and the Canadian Medical Protective Association.

 

Correspondence to:

james.ruiter@salusglobal.com

 

This article has been peer reviewed.

 

*This series is an expansion of the thoughts and ideas in “Implementing patient safety initiatives,” by the same author.27

 
