Patient Safety Case Study Examples

Abstract

Motivated by pressure and a wish to improve, health care organizations are implementing programs to improve patient safety. This article describes six natural experiments in health care safety that show where the safety field is heading and opportunities for and barriers to improvement. All these programs identified organizational culture change as critical to making patients safer, differing chiefly in their methods of creating a patient safety culture. Their goal is a safety culture that promotes continuing innovation and improvement, transcending whatever particular safety methodology is used. Policymakers could help stimulate a culture of safety by linking regulatory goals to safety culture expectations, sponsoring voluntary learning collaborations, rewarding safety improvements, better using publicly reported data, encouraging consumer involvement, and supporting research and education.

Keywords: Patient safety, quality of care, organizational culture, delivery of health care

In any successful effort to create social change, innovations proliferate, outpacing the ability of the scientific process or the published literature to assess and report their results. This is certainly true of the safety improvement effort in the U.S. health care system at the current time. Motivated by governmental and nongovernmental regulators, pressure from their staffs and patients, and their own wish to do the right thing, leaders of health care organizations around the country are introducing programs to improve patient safety. Many of these are not being evaluated and probably never will be studied with the rigor that academicians would like. Yet when properly interpreted, the histories of these programs may offer valuable insights into where the safety field is heading and opportunities for and barriers to improvement. With this in mind, we conducted case studies of ten potentially promising examples of safety improvement programs in health care institutions around the country. This article briefly reports the characteristics and lessons of six of these ongoing natural experiments in health care safety (see Table 1 for an overview of the cases).

TABLE 1

Case Study Sites, Context, and Sources

To choose our cases, we contacted recognized leaders in the patient safety field and asked them which institutions were, in their view, doing the most exciting work in patient safety. Among our informants were individuals from the Institute for Healthcare Improvement, the Joint Commission on Accreditation of Healthcare Organizations (JCAHO), research funders, and academicians who study patient safety. These informants identified more than twenty potential cases, from which we selected for closer study the ten we considered the most interesting and promising for broad application. We present here six unique cases for analysis; the other four cases describe similar interventions. A fuller description of all ten cases and a list of those persons whom we consulted can be found at the Commonwealth Fund website (http://www.cmwf.org). We collected the information presented in these cases through telephone interviews and written communications with leaders in each organization and from a review of relevant publications (sources are listed in Table 1). We conducted our fieldwork in the fall of 2004. A brief version of our findings was presented on November 4, 2004, at a conference sponsored by the Commonwealth Fund of New York City to mark the fifth anniversary of the publication of To Err Is Human, the Institute of Medicine's seminal report on patient safety.

We should mention at the outset some of the limitations of our work. Our methods were intended to capture a snapshot of the key accomplishments of leading organizations and to synthesize the self-perceived learning of their internal change leaders. Our findings are not meant to be representative of all health care organizations. The cases pertain to the hospital setting, because to date that has been the focus of most patient safety improvement efforts. We did not examine expensive high-technology innovations, such as computerized physician order entry systems, because we wished to concentrate on approaches that would be broadly applicable, regardless of an institution's ability to make large capital investments. Reliance on a small number of key informants means that the cases present a necessarily circumscribed perspective from which it is not possible to critically examine organizational dynamics or test theoretical models. Hence, our work should be considered only a starting point for more detailed organizational analyses.

One overriding lesson of our work has emerged. All the programs we examined cited cultural change—the creation of a “patient safety culture”—within their organizations as critical to making patients safer. Thus the organizations seemed to differ chiefly in the methods they were using to create this safety culture. Some were trying to change culture directly. Others were using less direct methods, relying on particular reforms in the structure or process of care, such as promoting teamwork to improve safety vigilance or introducing methods to reduce variability in the processes of care, and hoping that attitudes would change as behavior changed. Some were using both direct and indirect approaches. But regardless of the means they chose, our study organizations shared the goal of cultural reform.

The fact that real organizations undertaking real change are focusing on culture in this way is both reassuring and sobering. As we shall see, the published literature on safety improvement in health care and other fields emphasizes the centrality of organizational culture in developing high-reliability organizations, institutions that are resistant to catastrophic failures that injure clients or patients (Roberts 1993; Weick 1987). Thus it is encouraging that in the health care field, which has come late to the safety movement, this important insight is taking root so quickly, at least in the pioneering organizations. At the same time, the fact that changing the culture lies at the foundation of safety improvement is sobering because there is as yet no proven formula or well-trod path for creating cultural change in health care organizations. Cultural change efforts are also extremely hard to study because culture is difficult to measure reliably and changes seem to occur incrementally and unpredictably in organizations. The lack of a clear road map for cultural change and the very elusiveness of safety culture as a destination make this a particularly daunting goal for leadership to set. Yet leadership commitment, our cases suggest, is essential to success. One function of our cases may be to assure leaders who are taking personal and organizational risks to create a safety culture that they are in good company and to offer ideas and examples that they can take back to their own institutions.

Our article has four parts. First, because the concept of safety culture is central to understanding what currently is happening at the forefront of safety improvement, we briefly review the topic of patient safety culture. Second, we provide abbreviated versions of six of the cases we examined. Third, we analyze the attributes of the safety culture acquired by these organizations. Finally, we provide concluding lessons, observations, and policy implications. Table 1 provides an overview of the case study organizations, and Table 2 defines the key terms we use.

Safety Culture: Its Importance, Definition, and Attributes

In To Err Is Human, the Institute of Medicine noted that “a strong culture of safety … is viewed by many in the safety field as being the most critical underlying feature of their accomplishments” (IOM 1999, 160). Likewise, a review of several high-profile patient safety failures occurring internationally in health care organizations found that “preventing future failures depends on cultural as much as structural change in health care systems and organizations” (Walshe and Shortell 2004, 103). Certain prevailing aspects of health care's organizational and professional culture, such as steep authority hierarchies and a lack of teamwork, an unwillingness to acknowledge human fallibility, and the tendency to punish rather than learn from error, can act as barriers to patient safety and its improvement (Akins and Cole 2005; Sexton, Thomas, and Helmreich 2000; VanGeest and Cummins 2003).

The IOM recommended that health care organizations “develop a culture of safety such that an organization's design processes and workforce are focused on a clear goal—dramatic improvement in the reliability and safety of the care process.” To achieve this goal, “safety must be an explicit organizational goal that is demonstrated by clear organizational leadership and professional support” (IOM 1999, 166).

Although organizational safety culture has no one agreed-upon definition, one used in the nuclear power industry is helpful:

Safety culture is the product of individual and group values, attitudes, competencies and patterns of behaviour that determine the commitment to, and the style and proficiency of an organization's health and safety programmes. Organizations with a positive safety culture are characterized by communications founded on mutual trust, by shared perceptions of the importance of safety and by confidence in the efficacy of preventive measures.

(ACSNI 1993, quoted in IAEA 2002, 14)

Based on research on human error and the ways that organizations have achieved or failed to achieve safety, James Reason (1997) and other experts (Helmreich and Merritt 1998; IAEA 2002) contend that organizations can create a safety culture by identifying and consistently applying practical measures associated with enhanced safety. Changes often begin at the level of the unit or work group, both to test approaches before using them more widely and, given limited resources, to narrow the scope of work. Surveys of safety culture reveal that an organization's culture varies among units of the same hospital (Sexton 2005). Hence, changing the local culture might require tailoring improvements to the specific local working environment.

As far as we know, no health care organization as yet can offer a model of what other health care institutions should strive to achieve in their safety culture. Accordingly, health care leaders must determine their own objectives. In our view, this makes it all the more important that institutions share their aspirations, efforts, progress, and disappointments in as close to real time as possible. To help analyze what case study organizations have learned, we examined the cases using a framework developed by James Reason (1997), which defines five interrelated attributes for a safety culture: an informed culture, a reporting culture, a just culture, a flexible culture, and a learning culture (see Table 3 for definitions). These attributes have been used to describe the development of a safety culture in other industries and sectors such as nuclear power generation, oil and gas extraction and refining, and commercial and military aviation (Hudson 2003; Reason 1997). The IOM's reliance on Reason's work has made it a touchstone for the patient safety movement, and several of our informants cited these attributes when describing their own work.

TABLE 3

Five Attributes of a Safety Culture

Before presenting the cases, we call attention to an important distinction: several of the cases describe their results as observed or reported numbers of events, which may differ from the number that actually occurred. The fact that errors and their consequences may not be immediately obvious to patients and caregivers is one reason that the health care field lags in pursuing safety. Although medicine has a strong tradition of professional concern for patients' welfare, the complexity of modern systems in which health care professionals perform their work means that they are often unable to see the safety implications of their human and technological interactions. The goal of patient safety is to rectify this system “blindness” by providing the means to detect and eliminate or mitigate system vulnerabilities that could harm patients.

Case Studies

Sentara Norfolk General Hospital

Beginning in 2002, Sentara Norfolk General Hospital (SNGH), in Norfolk, Virginia, tried to improve patients' safety by strengthening its organizational culture of safety. Despite a record of successful technology and process improvements, the hospital's leaders were frustrated that the overall pace and scope of organizational change were not greater. Although changes in culture were deemed the best strategy for improving safety, the lesson of other industries was not to focus on the organization's culture itself but to make safe behaviors a regular practice (Krause, Seymour, and Sloat 1999). To help in this effort, SNGH retained industrial consultants with a reputation for safety improvement in the nuclear power and manufacturing industries.

The consultants conducted a baseline organizational assessment and suggested four strategies to promote safety-related behaviors:

  1. Develop for all hospital staff, physicians, and hospital leaders three sets of behavior-based expectations (BBEs) linked to techniques for preventing errors (e.g., communicate clearly by using repeat-backs and clarifying questions).

  2. Establish “red rules” (high priority rules) to focus employees' attention on the potential for harm if certain critical safety procedures are not followed exactly (e.g., positive identification before taking any action with a patient, verifying the site before surgery).

  3. Improve the staff's ability to conduct timely and rigorous “root cause analysis” (see Table 2 for definition) of major safety events, such as by identifying the common contributing causes of a series of events, so that these analyses identify long-lasting, systems-oriented change.

  4. Adopt a human factors approach to simplifying policies and procedures (e.g., using a checklist to identify and standardize the principal steps in a process).

A “grassroots” group of employees developed the hospital staff and leadership BBEs, and another group of physicians and nurses created the physician BBEs. The groups also reviewed and adapted proven error management behaviors, tools, and techniques from other high-risk industries to fit the health care environment.

To help make expected behaviors become habits, supervisors regularly offer feedback on BBEs, which serve as core competencies for staff performance reviews. Managers informally observe progress when they make walk rounds in hospital units. Trained observers from the hospital's clinical effectiveness department use validated tools adapted from other high-risk industries to determine whether opportunities for applying behaviors are handled appropriately at critical safety junctures, such as during shift-change reports. These observations form the basis for measuring overall organizational progress on BBEs through a system called Real-Time Behavior-Based Monitoring, which is one of the several safety-related measurements in the hospital's integrated performance indicator system.
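
To make the measurement idea concrete, the sketch below shows one way such observation data could be aggregated into a compliance rate per expected behavior. It is only an illustration: the record fields, behavior names, and scoring are our assumptions, not a description of SNGH's actual Real-Time Behavior-Based Monitoring system.

    # Minimal sketch: aggregating behavior-based observations into compliance rates.
    # Record fields and behavior names are hypothetical, not SNGH's actual system.
    from collections import defaultdict

    observations = [
        {"unit": "ICU", "behavior": "repeat-back", "performed": True},
        {"unit": "ICU", "behavior": "repeat-back", "performed": False},
        {"unit": "Med-Surg", "behavior": "clarifying question", "performed": True},
    ]

    def compliance_by_behavior(obs):
        """For each expected behavior, return the fraction of observed
        opportunities in which the behavior was actually performed."""
        counts = defaultdict(lambda: [0, 0])  # behavior -> [performed, total]
        for record in obs:
            counts[record["behavior"]][1] += 1
            if record["performed"]:
                counts[record["behavior"]][0] += 1
        return {b: done / total for b, (done, total) in counts.items()}

    print(compliance_by_behavior(observations))
    # {'repeat-back': 0.5, 'clarifying question': 1.0}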

The preliminary results of these efforts include a 42 percent increase in the use of expected communications behaviors by hospital staff from 2003 to 2004, an 84 percent reduction in ventilator-associated pneumonias from 2001 to June 2004, and a 63 percent decrease in the rate of device-associated bloodstream infections from 2002 to June 2004. Based on SNGH's experience, Dr. Gary Yates, vice president and medical director of clinical effectiveness, said that health care leaders need to exhibit “measured impatience” if they want to see breakthroughs in patient safety and move to the next level of organizational performance.

U.S. Department of Veterans Affairs

The Veterans Health Administration, a component of the U.S. Department of Veterans Affairs (VA), established its National Center for Patient Safety in 1999 to lead the change in the organizational culture by empowering local facilities and frontline staff with proven tools, methods, and initiatives to improve patients' safety. This culture change program used human factors principles and the experience and lessons from high-reliability industries such as aviation and nuclear power. It was initiated as part of a broader organizational transformation undertaken by the VA in response to public and congressional concerns about the quality of care in VA facilities. The safety program's main components were:

  • Establishing a nonpunitive approach to patient safety that protects the confidentiality of those who report unintended errors, which are clearly distinguished from blameworthy acts. With the cooperation of Congress, its unions, and the Joint Commission on Accreditation of Healthcare Organizations (JCAHO), the VA defined a blameworthy act as a criminal act, an act related to alcohol or substance abuse or patient abuse, or an intentionally unsafe act that an individual knew to be unsafe. Such acts are reported to the local facility administration for investigation and possible disciplinary action.

  • Encouraging the reporting of both adverse events and close calls (see Table 2 for definitions). VA employees are asked to report safety events to their facility's patient safety manager. The employee who makes these internal reports remains “identified” until the root cause analysis is completed so that the employee can be notified of and comment on the findings. If an employee is not comfortable reporting internally, he or she may report a safety event to an external reporting system operated by the NASA Ames Research Center on the VA's behalf. This external patient safety reporting system is modeled on the Aviation Safety Reporting System, which NASA operates for the Federal Aviation Administration. Employees making external reports are asked to identify themselves so that they can be contacted if additional information is needed, but the reports are subsequently “de-identified.” In this way, the external system protects the employees' anonymity. (A simplified sketch of this de-identification workflow follows this list.)

  • Designing and providing training on easy-to-use, computer-aided root cause analysis tools and cognitive aids for multidisciplinary teams of frontline staff to analyze reported safety events in each facility. The goal is to determine what happened, why it happened, and what can be done to prevent it from happening again. The National Center analyzes similar events to decide how to address common issues throughout the system.

  • Adapting a systems engineering tool, known as “failure modes and effects analysis” (see Table 2 for definition), to discover critical system vulnerabilities and to design and assess improvements that will prevent and reduce harm to patients.

  • Disseminating throughout the VA warnings about potential safety threats and lessons learned about effective system improvements.
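
The confidential-but-identified reporting workflow noted in the list above can be pictured with a minimal sketch: the reporter is identified only long enough to support follow-up and feedback, after which the stored report is de-identified. The field names and workflow below are assumptions for illustration, not the VA's or NASA's actual system design.

    # Illustrative sketch of confidential (not anonymous) safety-event reporting:
    # identity is kept only until feedback on the analysis has been delivered.
    # Field names and workflow are hypothetical, not the VA's actual design.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class SafetyReport:
        description: str
        event_type: str                         # "adverse event" or "close call"
        reporter_contact: Optional[str] = None  # retained only for follow-up
        analysis_complete: bool = False

        def close_out(self, findings: str) -> str:
            """Notify the reporter of the findings, then de-identify the report."""
            notice = f"Findings sent to {self.reporter_contact}: {findings}"
            self.reporter_contact = None        # stored report carries no identity
            self.analysis_complete = True
            return notice

    report = SafetyReport("Mislabeled specimen caught before processing",
                          "close call", reporter_contact="nurse.x@example.org")
    print(report.close_out("Relabeling step added at specimen intake"))
    print(report)  # reporter_contact is now None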

Within ten months of enhancing its internal patient safety reporting system in 1999, the VA experienced a thirtyfold increase in the reporting of events, indicating that the promise of confidential, nonpunitive reporting was important to the workforce. The relatively small number of reports received by the external reporting system (fewer than four hundred since its inception in 2002, compared with hundreds of thousands of safety events reported internally) suggests that the VA has achieved a high level of trust in its internal reporting system. Since the creation of enhanced tools and training, nearly all root cause analyses have been able to recommend a solution, whereas previously about half the reviews of patient safety events failed to identify a solution—a 100 percent increase in the perceived preventability of events. In addition, the types of root causes identified have shifted from patients' behavior and professional training to human factors and systems issues, leading to what managers regard as more effective and sustainable solutions. These trends suggest that teams no longer consider the circumstances giving rise to errors as impossible to change. One measure of the change in the VA's culture is that its personnel seem to be applying what they have learned about systems thinking to other areas, and they want to set and achieve safety goals that exceed the JCAHO's accreditation requirements, said Dr. James Bagian, director of the VA National Center for Patient Safety.

Kaiser Permanente

Kaiser Permanente (KP) instituted a program of organizational learning in 2002 to promote teamwork and communication in high-risk areas such as surgery and labor and delivery. Clinical leaders were taught safety-oriented principles and techniques adapted from the U.S. Navy and from airline crew resource management training (Helmreich 2000) (see Table 2 for definition).

At the KP Anaheim, California, Medical Center, a multidisciplinary team designed a preoperative safety briefing to enhance basic patient safety practices required by the JCAHO, such as the “time out” to verify surgical sites. A one-page checklist was created to prepare team members for cases, including practices to assess and mitigate safety risks. Analogous to the preflight checklist used in the airline industry, this checklist is adapted to the needs of each case and is posted throughout the operating theater as a reminder. Operating room personnel are periodically trained in human factors principles, followed by a short self-assessment for reinforcement.

Four northern California KP medical centers piloted a perinatal patient safety project (PPSP) in which they received training on safety sciences and instituted improvements to close local gaps in the characteristics of a high-reliability perinatal unit (Knox, Simpson, and Garite 1999). The principal interventions were:

  • Multidisciplinary patient rounds to ensure that the care plan was understood.

  • Assertive and structured communication techniques to promote accurate situational briefings (such as when interpreting and responding to fetal distress).

  • A communication escalation policy defining how safety concerns can be forwarded through the chain of command to avoid delays in responding to critical events.

  • Team briefings before a procedure and debriefings following an adverse outcome.

A multidisciplinary team developed a critical-event team-training program to practice teamwork and communication skills in simulated crisis situations using computerized mannequins. Complex training scenarios are based on actual cases that force the team to recover from errors. Drills are videotaped for debriefing, during which the team sees where improvement is needed.

After a six-month trial of preoperative briefings at the Anaheim Medical Center, no wrong-site surgeries were reported (three were reported in the previous year, but the small number of events means that no firm conclusions can be drawn). At the same time, other reported error management behaviors increased, such as a willingness to speak up about safety concerns and to report and discuss mistakes, suggesting that team members had better “situational awareness” (see Table 2 for definition). Comparing responses to the University of Texas Safety Attitudes Questionnaire (Sexton et al. 2004) made before and after the intervention indicates that operating room staff perceived an improvement in the safety culture and teamwork. The turnover rate among nurses fell by two-thirds (from 23 percent to 7 percent) and has been sustained at a lower level than that in comparison contract hospitals. One year following the implementation of the PPSP, the labor and delivery staff in all four perinatal sites rated the safety culture more highly than before the intervention. (Several years of data will be required to measure the effect of the PPSP on perinatal adverse events.)

Briefings are a powerful way to change the way that people think about and practice teamwork, said Dr. James DeFontes, physician director of surgical services for KP Orange County. Explicit communication helps people attend to the task at hand, bridges cognitive gaps in training and experience levels, and avoids unjustified assumptions about others' knowledge, he added. Improving team communication is, ultimately, about organizational cultural change. Effective change requires a “bottom-up approach” supported by leadership and physician involvement, combined with ways of inculcating expected behaviors in everyday practice, Dr. Michael Leonard, physician leader for patient safety at Kaiser Permanente, said.

Missouri Baptist Medical Center

Missouri Baptist Medical Center, in St. Louis, tried to improve patient outcomes by enhancing its existing emergency response capability to intervene early with patients showing signs of medical deterioration before they suffered crises such as cardiac or respiratory arrest. The hospital's leaders were impressed by the benefits of rapid response teams (also known as medical emergency teams) in reducing adverse events in Australian hospitals (Bellomo et al. 2003, 2004) and a few institutions in the United States (IHI 2004a). In 2004, they made a similar effort in their hospital. Ideas for their plan were obtained from collaboration with the Institute for Healthcare Improvement and from the experience of other adopters. The intervention was carried out during an organization-wide culture change initiative aimed at aligning the hospital's mission, structure, reward system, and leadership with patient safety goals and building a partnership with affiliated physicians to institute safety improvements.

Missouri Baptist's rapid response team consists of a physician assistant acting as the team leader, a critical care nurse, and a respiratory therapist. The team is empowered to take whatever action is needed to stabilize the patient, within the scope of their practices, while reporting to and consulting with the hospital's ICU physician intensivist as needed. The patient's attending physician is notified and informed of the team's assessment and may order additional tests at his or her discretion. Patients are transferred to the ICU or to an intermediate care unit when appropriate.

The hospital staff were taught the purpose and use of the rapid response team before it was put in place, a step that the hospital's leaders considered necessary to its success. Specific clinical criteria were compiled to guide the nursing staff in deciding when a patient's condition warranted calling for help. In addition, anyone could call for help when he or she was simply worried or concerned about a patient. The benefit of the rapid response team was reinforced by sharing stories of successful interventions.
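
Our sources do not list the specific clinical criteria the hospital compiled; the thresholds below are generic examples of the vital-sign triggers rapid response programs commonly use, shown only to illustrate how such criteria might be expressed, with the "worried or concerned" rule kept as an explicit trigger.

    # Hypothetical illustration of rapid-response calling criteria; the numeric
    # thresholds are generic examples, not Missouri Baptist's actual criteria.
    def reasons_to_call_rapid_response(vitals, staff_concerned=False):
        """Return the reasons (if any) to call the rapid response team."""
        reasons = []
        if staff_concerned:
            reasons.append("staff member is worried or concerned about the patient")
        rr = vitals.get("respiratory_rate")
        if rr is not None and (rr < 8 or rr > 28):
            reasons.append("abnormal respiratory rate")
        hr = vitals.get("heart_rate")
        if hr is not None and (hr < 40 or hr > 130):
            reasons.append("abnormal heart rate")
        if vitals.get("systolic_bp") is not None and vitals["systolic_bp"] < 90:
            reasons.append("low systolic blood pressure")
        if vitals.get("spo2") is not None and vitals["spo2"] < 90:
            reasons.append("low oxygen saturation")
        return reasons

    print(reasons_to_call_rapid_response({"respiratory_rate": 32, "spo2": 88}))
    # ['abnormal respiratory rate', 'low oxygen saturation']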

After two months of full implementation, calls for the rapid response team steadily increased to about seventy to eighty per month, indicating that the floor nurses recognized the value of this safety resource. During this time, there was a 60 percent decrease in emergency calls for respiratory arrest and similar crises and a 15 percent decrease in cardiac arrests, suggesting that these acute crises were being averted through early intervention. Anecdotal feedback from family members indicated they were impressed and gratified by efforts made on behalf of the patients, according to Nancy Sanders, the hospital's performance improvement coordinator.

A rapid response team should be considered a support resource for frontline staff, observed Dr. John Krettek, vice president of medical affairs. Incidents can be used as a teaching opportunity for the ICU nurse to share insights into the case so that the floor nurse can sharpen his or her assessment skills and learn how to respond to a similar event in the future. In this way, rapid response teams may promote a culture of safety by building teamwork and spreading knowledge and skills throughout the hospital.

Johns Hopkins Hospital

In 2001, intensive care unit (ICU) physicians and staff at the Johns Hopkins Hospital in Baltimore developed the Comprehensive Unit-Based Safety Program (CUSP) as a framework for quality improvement that can be tested, adapted, and introduced sequentially in hospital units. This program is part of a broader institutional commitment to improve patient safety that was partly based on a partnership with the family of a pediatric patient, Josie King, who was a victim of medical error at the hospital. The King family donated funds and worked with Hopkins physicians to create a patient safety program in the hospital's Children's Center, which has served as a model for improvement.

CUSP has eight steps: to (1) assess the unit's culture of safety, (2) educate staff about safety sciences such as systems thinking, (3) identify safety concerns, (4) meet regularly with a senior hospital executive who supports the removal of system barriers, (5) prioritize and implement improvements, (6) document and analyze results, (7) share success stories, and (8) reassess the unit's safety culture.

Improvement teams (consisting of a physician, nurse, and administrator, plus other staff who wished to join) spent time each week (four to eight hours) identifying and promoting safety improvement efforts. For example, a short-term patient goals form was drawn up based on a survey finding that nursing staff and residents frequently did not know the goals of patients' therapy. The form is used as a checklist during physician-led rounds to identify tasks that need to be completed by the care team and to identify and mitigate safety risks. A related project aimed to reduce bloodstream infections associated with the use of central venous catheters, which are often inserted in ICU patients to provide medication, nutrition, and fluids. A multidisciplinary team decided on the following interventions:

  • Require providers to receive education about evidence-based infection control practices and successfully complete a posttest as a precondition to inserting catheters.

  • Supply a catheter insertion cart with standardized supplies needed to meet infection control guidelines for the sterile insertion of catheters.

  • Follow a checklist to ensure adherence to evidence-based guidelines for safe catheter insertion (a simplified sketch of such a checklist follows this list).

  • Empower nurses to intervene if guidelines are violated.

  • Add an item to the daily goals sheet that prompts the ICU team to ask physicians daily during patient rounds whether catheters can be removed.
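
As referenced in the checklist item above, a minimal sketch of how such a catheter-insertion checklist might be represented and reviewed is shown below. The items listed are commonly cited evidence-based practices for central line insertion, not necessarily the exact content of the Hopkins checklist, and the review function is our illustration of the nurse-empowerment step.

    # Simplified sketch of a central-line insertion checklist; the items are
    # commonly cited evidence-based practices, not necessarily Hopkins' exact list.
    CENTRAL_LINE_CHECKLIST = [
        "Wash hands before the procedure",
        "Clean the insertion site with chlorhexidine",
        "Use full-barrier precautions (cap, mask, sterile gown, gloves, full drape)",
        "Avoid the femoral site when possible",
        "Review daily whether the catheter can be removed",
    ]

    def unmet_items(completed):
        """Return checklist items not yet completed; per the intervention above,
        a nurse is empowered to intervene if any remain unmet."""
        return [item for item in CENTRAL_LINE_CHECKLIST if item not in completed]

    print(unmet_items({"Wash hands before the procedure",
                       "Clean the insertion site with chlorhexidine"}))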

After the daily goals sheet was introduced, self-reported understanding of goals of care increased from 10 percent to 95 percent of residents and nurses during an eight-week period. One year following the implementation of the CUSP initiative, the average ICU length-of-stay fell by one day in one ICU and by two days in a second ICU. Medication errors were eliminated in orders to transfer patients out of the ICU. The proportion of the ICU staff who gave positive ratings of the safety culture rose by nearly half in one ICU and nearly doubled in the other, as measured on the Safety Climate Scale (Sexton and Thomas 2003). Senior executives' involvement with the ICU led to structural changes, including the creation of specialized patient transport teams and the presence of pharmacists in ICUs. Documented catheter-related bloodstream infections were eliminated, preventing an estimated forty-three infections and eight deaths and saving an estimated $2 million annually.

CUSP is now being used as a framework for patient safety improvement throughout Johns Hopkins Hospital. Initiating change efforts within a single work unit and then replicating successful approaches in other units appears to be a promising approach to building an organizational culture of safety in the Hopkins system. “When you create a system that reliably delivers the processes or interventions that work, spectacular performance improvement follows,” Dr. Peter Pronovost, medical director of the Johns Hopkins Center for Innovations in Quality Patient Care, said in a recent interview for the Joint Commission Journal on Quality and Safety (Berman 2004, 663).

OSF St. Joseph Medical Center

OSF St. Joseph Medical Center, in Bloomington, Illinois, used several strategies to promote an organizational safety culture: the widespread reporting of errors, systems thinking about preventing errors, and collaborating for improvement. Many ideas were derived from participation in the Institute for Healthcare Improvement's Quantum Leaps in Patient Safety collaboration, which the hospital joined in 2001. For example, safety concerns and ideas for improvement are elicited from nursing staff during safety briefings at shift changes and through executive walk rounds (Frankel et al. 2003), during which hospital executives periodically meet with frontline staff in each of the hospital's work units.

For its first patient safety effort, the hospital instituted several ways of reducing medication errors. An enhanced medication reconciliation process was established as the foundation for preventing adverse drug events (Rozich and Resar 2001). This process had the following steps:

  • A nurse interviews the patient or the patient's advocate at hospital admission to obtain the best possible information about home medication use.

  • When patients are transferred between hospital units or discharged from the hospital, their existing medications are compared with those ordered by the physician to be continued.

  • Any discrepancy between medications ordered and currently being taken at hospital admission, transfer, or discharge must be reconciled by the patient's physician within four to twenty-four hours, depending on the type of medication.

  • A pharmacist reviews the patient's home medication use and the physician's orders to detect and avoid any medication errors in dosing or other problems such as potential drug interactions.

An automated standard medication reconciliation sheet doubles as the medication order sheet for review and approval by the physician at admission, transfer, and discharge, which saves time and prevents transcription errors. A duplicate copy is given to the patient to take home, and another copy is sent to the patient's referring physician.
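
The comparison at the heart of this reconciliation process can be sketched as follows. The data structures are hypothetical and deliberately minimal: a real reconciliation also compares dose, route, and frequency, and, as noted above, routes every discrepancy to the patient's physician within the required time window.

    # Illustrative sketch of medication reconciliation at a transition of care.
    # Structures are hypothetical; real reconciliation also checks dose, route,
    # and frequency and sends discrepancies to the physician for resolution.
    home_medications = {"lisinopril", "metformin", "aspirin"}
    ordered_medications = {"metformin", "aspirin", "heparin"}

    def reconcile(home_meds, ordered_meds):
        """Flag discrepancies between home and newly ordered medications."""
        return {
            "home_meds_not_reordered": sorted(home_meds - ordered_meds),
            "new_orders_not_on_home_list": sorted(ordered_meds - home_meds),
        }

    print(reconcile(home_medications, ordered_medications))
    # {'home_meds_not_reordered': ['lisinopril'],
    #  'new_orders_not_on_home_list': ['heparin']}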

Other improvements in the medication process used human factors principles, such as standardized order sets and dosing services for certain high-risk medications, to reduce complexity and unnecessary variation that could lead to errors and harm to the patient. Pharmacists on nursing units perform independent double checks while entering medication orders, which permits them to detect dosing errors or potential drug interactions and to immediately contact physicians for resolution. A systematic risk assessment (known as “failure modes and effects analysis,” see Table 2 for definition) is used to identify vulnerabilities in the medication-dispensing process and to design countermeasures. For example, to prevent the reuse of leftover medication, the floor nurse places it in a plastic bag, which is picked up by pharmacy technicians during hourly rounds on the floor. The telephonic Adverse Drug Event Hotline was instituted to simplify reports of adverse drug events and permit pharmacists to analyze potential problems each day.
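
Failure modes and effects analysis typically scores each potential failure mode for severity, likelihood of occurrence, and likelihood of detection, and multiplies the scores into a risk priority number used to rank vulnerabilities for countermeasures. The sketch below is a generic illustration of that scoring; the failure modes and numeric scales are our examples, not OSF St. Joseph's actual analysis.

    # Generic FMEA sketch: rank failure modes by risk priority number (RPN).
    # Failure modes and 1-10 scores are illustrative only.
    failure_modes = [
        {"mode": "Leftover dose returned to floor stock and reused",
         "severity": 7, "occurrence": 4, "detection": 6},
        {"mode": "Look-alike drug selected from dispensing cabinet",
         "severity": 9, "occurrence": 3, "detection": 5},
        {"mode": "Order transcribed with misplaced decimal point",
         "severity": 10, "occurrence": 2, "detection": 4},
    ]

    for fm in failure_modes:
        fm["rpn"] = fm["severity"] * fm["occurrence"] * fm["detection"]

    for fm in sorted(failure_modes, key=lambda f: f["rpn"], reverse=True):
        print(f'RPN {fm["rpn"]:3d}  {fm["mode"]}')
    # Higher-RPN modes are addressed first, e.g., the plastic-bag pickup
    # countermeasure for leftover medication described above.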

Following these interventions, the rate of adverse drug events dropped by 91 percent (from 5.8 to 0.5 per 1000 doses) from June 2001 to May 2003, as detected using a “trigger tool” for sampling medical records developed by the Institute for Healthcare Improvement (IHI 2004b). The hospital's perceived safety culture also improved, as measured using the University of Texas Safety Climate Survey (Sexton and Thomas 2003).

Discussion: Acquiring a Safety Culture

The cases illustrate how health care organizations are working to instill the five characteristics of high-reliability safety culture identified by James Reason (1997) and defined in Table 3. They are seeking, first of all, to become informed about system vulnerabilities that threaten patients' safety so that they can plan and prioritize system improvements. Sentara Norfolk General Hospital and OSF St. Joseph Medical Center, for example, have integrated performance indicator systems that track progress in meeting safety goals by measuring safety-related attitudes, behaviors, events, risks, and outcomes using multiple data sources such as surveys, assessments, incident reports, direct observation, medical records, and malpractice claims. At the apex of this system, executives at these hospitals make walk rounds on hospital wards to cultivate an awareness of safety issues and demonstrate to the staff that safety is important. The kind of top-to-bottom organizational safety audit conducted by Sentara Norfolk General Hospital appears to be a valuable but underused resource for detecting weaknesses and opportunities for improvement.

To support an informed culture, case study organizations encourage the reporting of safety incidents and concerns (including near misses or close calls) through internal and external reporting systems, safety briefings, and executive walk rounds. They appear to have moved beyond a superficial preoccupation with the volume of reports as a barometer of safety culture to a more mature outlook that values reports for the learning that they enable when accompanied by effective analytic tools. For example, insights obtained by analyzing close calls, which occur more frequently than adverse events, can be used to discover weaknesses in the system before they harm patients. To encourage participation and build trust in the efficacy of reporting, case study organizations have established feedback loops to discuss with both management and frontline staff what they are learning and doing as a result. Although several organizations use anonymous reporting systems, the experience of the Veterans Health Administration suggests the feasibility of a confidential reporting system in which incident reporters identify themselves so that they can be included directly in the feedback loop.

These organizations recognize the necessity of providing a psychologically safe environment for reporting medical errors so that mistakes can be identified, learned from, and prevented rather than hidden out of fear of punishment. Some have characterized this as a “blame-free” environment for patient safety. Others have more carefully defined their aspiration as a nonpunitive, or just culture, that protects the reporting of honest errors while recognizing that misconduct—in which an individual intentionally endangers patients—is not an error (Leape 2005). The Veterans Health Administration, for example, found that its employees were willing to greatly increase the reporting of safety incidents under a credible promise of confidentiality that clearly defined misconduct and excluded it from protection.

A flexible culture encourages greater teamwork and collaboration across disciplines to help maintain and improve patients' safety under the often complex and demanding delivery of health care (Leonard, Graham, and Bonacum 2004). Kaiser Permanente and Johns Hopkins Hospital, for example, are attempting to flatten the hierarchy by training and encouraging surgical and critical care team members to “speak up for safety” when they observe an unsafe situation and to cross-check one another's work to prevent errors. Kaiser Permanente has learned that training obstetrical teams using simulated emergencies gave them a new awareness of how their behavior affected others and built appreciation for teamwork in critical situations. Cross-cutting interventions such as the rapid response team at Missouri Baptist Medical Center promote a flexible culture by facilitating learning and information sharing across traditional organizational boundaries where breakdowns in communication and safety can often occur. These kinds of role changes may lead to what Karl Weick (2002, 193) called a “mindful interdependence” in which people “subordinate their idiosyncratic intentions to the effective functioning of the system.”

Finally, study organizations seem to be promoting a learning culture by undertaking system reforms based on data and knowledge gleaned from both inside and outside their local environment. Providing easy-to-learn analytic tools and training for frontline staff to draw useful conclusions and take action based on incident reports promoted a sense of local ownership for improvement and was “the single most noteworthy means of changing culture” within the Veterans Health Administration, said Dr. James Bagian, director of its National Center for Patient Safety. The VA encourages an action orientation by requiring local executives to reach an agreement with incident investigation teams on remedies that will be taken to address identified vulnerabilities. Kaiser Permanente and the OSF St. Joseph Medical Center use safety briefings and debriefings as a means of ongoing, self-guided team learning and error management. The use of risk assessment tools supports learning at OSF St. Joseph Medical Center by helping teams find weaknesses in the system and objectively weigh options for improving their reliability, thereby avoiding “quick fixes” that might actually increase harm. Sentara Norfolk General Hospital is combining direct observation of behaviors with coaching and feedback to promote the practical learning of new safety-enhancing behaviors that, over time, can become an internalized safety culture.

The organizations we studied differed in whether they approached culture change as an organization-wide project or as a team- or unit-based undertaking. The experience of Sentara Norfolk General Hospital suggests that a comprehensive change strategy can help accelerate the pace and scope of organizational change. Such an approach is likely to require much more upfront investment of staff time and resources. But the hospital also found that corporate principles had to be adapted and embedded in each unit's specific work. Those taking a unit-based approach, such as Johns Hopkins Hospital, find that discrete achievements build interest and momentum for introducing change in other units. Dr. Peter Pronovost, a patient safety change leader at Johns Hopkins, explained that the dissemination of change requires a commitment to apply scientifically valid safety principles within a flexible structure that relies on local wisdom to determine improvement priorities. The intersection between unit-based and organization-wide culture change should be better understood as unit-based efforts are replicated throughout hospitals. It is likely that change will be uneven across units. Hospital managers might be able to use culture surveys and techniques like executive walk rounds to identify and focus attention on units in need of greater improvement so as to produce a more consistent safety culture across the organization.

A fair question is how discrete efforts at improvement add up to organization-wide change. Patient safety practitioners such as Kathy Haig at OSF St. Joseph Medical Center describe a gradual process in which specific improvements produce cascading effects that build higher levels of awareness and engagement in management and the workforce. With sustained effort over several years, there is a sense of gaining traction, that management is really “walking the talk,” and that frontline staff and physicians are getting on board. Several case study organizations found that introducing improvements through small-scale tests of change helped pave the way for institutionalizing new safety practices while minimizing the risk of failure by involving staff in giving feedback at incremental stages of the implementation.

Lessons Learned

Change agents in the case study organizations learned much from their experience implementing patient safety improvements. These lessons include:

  • Apparently dramatic improvements in safety, such as the elimination of documented catheter-related bloodstream infections in the ICU and a tenfold reduction in detected adverse drug events, seem to challenge the assumption that adverse events must be tolerated as an inevitable side effect of health care.

  • Safety principles and techniques developed in other industries, such as airline crew resource management training, may be applicable to health care for similar human factors issues. Simple human factors engineering approaches, such as the standardization and simplification of processes and independent checks to catch errors, often seemed to be effective. Nevertheless, knowledge and tools must be adapted to fit the culture of medicine and the particular organizational context and safety threat (Musson and Helmreich 2004).

  • Safety awareness and vigilance can be taught by training and coaching staff to use practical skills, tools, and behaviors so that they gain the ability and confidence to identify safety threats and mitigate their causes, in both real-time work and later analysis. A safety culture is inculcated as these ways of systems thinking and behaving become the norm for individuals and teams. One practical result is to greatly increase the number of events that staff regard as preventable.

  • An organization's leadership can motivate and support a “bottom-up” approach to safety improvement among physician leaders and other frontline clinical staff. Some improvements require direct financial investment, and all require dedicated staff time to plan and implement. Several organizations noted the importance of repeatedly telling stories about successful improvements to introduce and reinforce desired cultural values and behaviors and to build momentum for change.

  • Focusing on patients' needs can be a powerful motivator for change. Some of the organizational changes described in the cases required renegotiating traditional roles. Resistance to change was typically overcome through education on the evidence for a practice and by focusing on an intrinsic motivator—doing what is best for the patient.

  • Seeking and measuring improvement in both systems (e.g., reduction of errors and vulnerabilities) and outcomes (e.g., reduction of adverse events) seems to enhance sustainability by validating clinical success factors while helping make a valid and meaningful case for patients' safety. Because many errors do not harm patients, it is important to concentrate on activities that will have the largest impact on improving safety.

Policy Implications

What has induced the organizational leaders we studied to undertake these efforts, and are these motivations applicable to other organizations? Interviews suggest that these institutions are internally motivated to perform well. Reputational rewards appear paramount, typically to fulfill high public expectations but, in some cases, as part of a wider effort to shore up reputation. Several organizations cited the financial and operational benefits of safety improvement, such as reductions in hospital length of stay and nursing staff turnover. Accreditation requirements are viewed by these leaders as a floor on which to build. In contrast, accreditation requirements appear to be the primary driver of safety efforts in most hospitals (Devers, Pham, and Liu 2004).

Can a culture of safety in health care be encouraged so that these examples become the norm? The internal mechanisms for building a safety culture described in these cases seem more likely to succeed if they accord with the external policy forces influencing health care organizations and the professionals who work in them. We offer the following observations for how policymakers in both the public and private sectors might build on intrinsic motivations to encourage a culture of safety through responsive regulation, voluntary collaboration, purchaser incentives, public reporting, consumer involvement, education, and research. Given that so little is known about safety culture, we offer our policy recommendations cautiously. Ideally, they should stimulate creative learning and continuing innovation to address evolving challenges over time.

Link Safety Goals to Safety Culture

The JCAHO appears to be acting as a proxy for government regulation in the safety sphere, with a stated goal of encouraging a culture of safety through continuous improvement (O'Leary 2003). In 2001 the JCAHO required that accredited institutions meet general patient safety standards such as creating a culture of safety and implementing a patient safety program. Since 2003, an evolving set of specific patient safety goals, such as establishing a medication reconciliation process, has been incorporated into accreditation requirements.

The JCAHO might capitalize on its position of influence to help health care organizations decide how they can apply specific patient safety goals to meet safety culture expectations. For example, the JCAHO might convene stakeholders to develop (and periodically update) guidance on how to create a safety culture in health care organizations based on lessons learned from leaders in the field and on practical improvement and culture change strategies that have been shown to work under particular circumstances and that can be instituted over time. Such guidance would need to be designed and used as a road map offering different routes to a common goal.

Hospitals could use safety culture guidance to structure organizational assessments for learning and improvement. Creating a repository linking assessment findings to standardized performance data and safety attitude surveys could help to identify potentially promising safety practices associated with good outcomes. Initially this could be done on a voluntary, collaborative basis. If this approach were successful, the JCAHO and its stakeholders might consider whether accreditation would be enhanced by incorporating a safety culture assessment into the accreditation survey process, as either a standard or an optional component. Alternatively, the JCAHO might certify third-party experts or methods to perform assessments. In any case, the goal should be to improve patients' safety in ways that are responsive to local needs and priorities.

Encourage Collaboration

Voluntary efforts to spread evidence-based practices, such as the Institute for Healthcare Improvement's 100,000 Lives Campaign (IHI 2005b), illustrate the potential for setting bold improvement goals when stakeholders band together to sponsor and participate in an initiative. Indeed, Gosfield and Reinertsen (2005) argue that the scientific basis and wide acceptance of the campaign's goals make them de facto national standards of care that are likely to be enforced through tort liability against hospitals that fail to undertake them. Other public-private partnership opportunities to disseminate practical knowledge and speed interorganizational learning have involved Medicare Quality Improvement Organizations (Dellinger et al. 2005), regional and statewide coalitions of stakeholders (Comden and Rosenthal 2002; Sirio et al. 2003), and recognition programs such as the Malcolm Baldrige National Quality Award and its state-level counterparts (McCarthy 2005b).

Offer Incentives

The financial returns from improvement often do not fully compensate health care providers for the cost of their efforts (Leatherman et al. 2003; O'Leary 2003). In response, health care purchasers—including federal and state governments, large self-insured employers, and health plans and insurers—are experimenting with pay-for-performance programs to reward health care providers for improvements in quality or attainment of quality goals (Rosenthal et al. 2004). Given the newness of incentive approaches, there is as yet little evidence of their effectiveness (Dudley et al. 2004; Galvin et al. 2005). Payment incentives might help stimulate a safety culture if enough public and private purchasers participate, the reward recognizes efforts to achieve accepted safety goals, and providers can share substantially in the savings achieved by their efforts—in other words, if rewards are perceived as real, important, and attainable. Other, simpler incentives include defraying the cost of institutions' participation in collaborative improvement initiatives (McCarthy 2005a) and offering providers a discount on malpractice liability insurance for participating in safety-enhancing activities, such as crisis management teamwork training in labor and delivery units (McCarthy and Staton 2005).

Leverage Public Data

Almost half the states require hospitals to report certain adverse events and incidents to a state agency, typically for the purpose of accountability (Wood and Nash 2005). Although in the past these data often were not systematically collected or analyzed, recently a few states have created patient safety centers to analyze data, educate health care providers and the public, foster collaboration, and stimulate a culture of safety (Rosenthal and Booth 2004). If experience shows that such centers are able to use state reporting data to disseminate new information about critical safety threats and promote worthwhile improvements, then their creation should be encouraged. The aviation industry's experience suggests that a centralized safety reporting system can improve safety awareness if it is nonpunitive, confidential, independent, and easy to use and it produces timely, expert, and systems-oriented feedback (Leape 2002). The newly enacted federal Patient Safety and Quality Improvement Act may contribute to the achievement of these goals by protecting the voluntary, confidential reporting of safety data to independent, federally certified patient safety organizations.

A few states now publicly report hospital-specific data on certain adverse events or hospital-acquired infections (IDSA 2005; Marchev, Rosenthal, and Booth 2003), which supporters advocate as both a matter of the public's right to know and a means to empower consumer choice and induce improvement. The public reporting of hospital-specific clinical quality data has been associated with better quality of care (Hibbard, Stockard, and Tusler 2005). Adverse event rates, however, might not be comparable if a higher rate reflects better reporting rather than worse quality. Moreover, some hospital leaders fear that nonconfidential public reporting might discourage internal error reporting (Weissman et al. 2005). The publication of process measures, such as the Leapfrog Group's voluntary survey of hospital compliance with recognized safety practices, appears to be more widely accepted.

Involve Consumers

Consumers have a potentially compelling stake in promoting a culture of safety in health care (NPSF 2003), but efforts in this regard are still nascent. The federal government, the JCAHO, advocacy groups, and hospitals are encouraging patients to be vigilant about their care and assertive in protecting themselves from errors. Patients may need education and coaching to gain the understanding and confidence needed to take recommended actions, such as questioning their health care professional (Hibbard et al. 2005). Some hospitals are including patients in interdisciplinary rounds held at the bedside to increase the detection of errors (Uhlig et al. 2002). Other hospitals, such as the Dana-Farber Cancer Institute, recruit patients to serve on Patient and Family Advisory Councils that help design and evaluate organizational policies, systems, and quality improvement initiatives (Ponte et al. 2003). Respect for patients and a desire to maintain their trust demand that hospitals and health professionals disclose the occurrence of adverse events to affected patients and their families. Current disclosure practices may not be adequately meeting patients' desire for an honest accounting and emotional support (Gallagher et al. 2003).

Advance Education and Research

Medical and nursing education are essential to shaping health care professionals' safety-related attitudes and behaviors by instilling an understanding of systems thinking about error and an appreciation for human factors such as teamwork and communication that can help keep patients safe. Federal funding for the Agency for Healthcare Research and Quality is vital to support research on safety culture and for the development of valid metrics and tools with which to measure and promote it (Clancy 2005). To help guide policy, new research is needed on how specific policy actions influence organizational behaviors and safety outcomes.

Conclusions

Organizational leaders in patient safety are undertaking various interventions to instill a safety culture, and the organizations we studied view culture change as a key element and accomplishment of their safety improvement efforts. These organizations appear to view safety culture as an aspirational system property that is more than the sum of their individual safety improvement efforts. The reason for this outlook appears to be a belief that a sustained safety culture allows continuing innovation and improvement that will transcend whatever particular safety methodology is used at any particular time.

The cases describe many impressive results, but they do not permit us to draw firm conclusions about the effects of culture vis-à-vis other potential causal explanations for the observed outcomes. All the case study organizations were part of integrated health care systems or multihospital systems that might offer advantages, such as greater in-house expertise, physician commitment, and financial resources for carrying out organizational change strategies. Nevertheless, the changes described are not beyond the reach of independent institutions. We examined other cases in which community hospitals made similar safety improvements when they were part of a supportive learning collaboration (McCarthy 2004).

Creating an organizational culture of safety may be both foundational to safety improvement and elusive as a goal. The process of instilling a safety culture requires the integration of both systems engineering and social engineering skills. A paradoxical insight is that the adoption of specific improvements both furthers—and is furthered by—organization-wide culture change. All the tools and techniques described in these case studies display promise in promoting a culture of safety when used as part of an intentional strategy to move the organization toward that goal. Yet as James Reason (1997) warned, safety improvement tools and techniques are not pieces of machinery that can simply be plugged in, turned on, and then forgotten. Rather, they are part of an organic organizational growth process that must be nurtured over a prolonged period of time.

Perhaps the most sobering discovery from our cases is that those who are furthest along the patient safety pathway report that their learning has made them realize how much more they need to accomplish to achieve a truly safe environment for patients. A question for practitioners, policymakers, and researchers is whether the kinds of approaches described here will provide the impetus to change the culture of health care in positive and enduring ways. Because of the dynamic and emergent nature of the safety field in health care, ongoing assessment will be valuable to take stock of progress and new learning as experience accumulates over time.

Acknowledgments

This research was supported by a grant to Massachusetts General Hospital/Partners Health System by the Commonwealth Fund. The authors acknowledge the Commonwealth Fund for grant support for this research, the interviewees who shared their organizational experiences (see Table 1), the experts who recommended cases for consideration (names can be found at http://www.cmwf.org), and three anonymous reviewers who provided valuable suggestions for improvement. The views expressed in this article are the authors' and do not necessarily reflect the opinions of the Commonwealth Fund or its directors, officers, or staff.

References

  • Advisory Committee on the Safety of Nuclear Installations (ACSNI). Organising for Safety: Third Report of the ACSNI Study Group on Human Factors. London: Health and Safety Executive; 1993. Quoted in IAEA 2002.
  • Agency for Healthcare Research and Quality (AHRQ). Patient Safety Network Glossary. Rockville, Md.: U.S. Department of Health and Human Services; 2005. Available at http://www.psnet.ahrq.gov/glossary.aspx.
  • Akins RB, Cole BR. Barriers to Implementation of Patient Safety Systems in Healthcare Institutions. Journal of Patient Safety. 2005;1:9–16.
  • Bagian JP, Gosbee J, Lee CZ, Williams L, McKnight SD, Mannos DM. The Veterans Affairs Root Cause Analysis System in Action. Joint Commission Journal on Quality and Safety. 2002;28:531–45.
  • Bagian JP, Lee C, Gosbee J, DeRosier J, Stalhandske E, Eldridge N, Williams R, Burkhardt M. Developing and Deploying a Patient Safety Program in a Large Health Care Delivery System: You Can't Fix What You Don't Know About. Joint Commission Journal on Quality and Safety. 2001;27:522–32.
  • Bellomo R, Goldsmith D, Uchino S, Buckmaster J, Hart GK, Opdam H, Silvester W, Doolan L, Gutteridge G. A Prospective Before-and-After Trial of a Medical Emergency Team. Medical Journal of Australia. 2003;179:283–87.
  • Bellomo R, Goldsmith D, Uchino S, Buckmaster J, Hart G, Opdam H, Silvester W, Doolan L, Gutteridge G. Prospective Controlled Trial of Effect of Medical Emergency Team on Postoperative Morbidity and Mortality Rates. Critical Care Medicine. 2004;32:916–21.
  • Berenholtz SM, Pronovost PJ, Lipsett PA, Hobson D, Earsing K, Farley JE, Milanovich S, Garrett-Mayer E, Winters BD, Rubin HR, Dorman T, Perl TM. Eliminating Catheter-Related Bloodstream Infections in the Intensive Care Unit. Critical Care Medicine. 2004;32:2014–20.
  • Berman S. An Interview with Peter Pronovost. Joint Commission Journal on Quality and Safety. 2004;30:659–64.
  • Clancy CM. Training Health Care Professionals for Patient Safety. American Journal of Medical Quality. 2005;20:277–79.
  • Cohen M, Eustis MA, Gribbins RE. Changing the Culture of Patient Safety: Leadership's Role in Health Care Quality Improvement. Joint Commission Journal on Quality and Safety. 2003;29:329–35.
  • Comden SC, Rosenthal J. Statewide Patient Safety Coalitions: A Status Report. Portland, Maine: National Academy for State Health Policy; 2002.
  • DeFontes J, Surbida S. Preoperative Safety Briefing Project. Permanente Journal. 2004;8(2):21–27.
  • Dellinger EP, Hausmann SM, Bratzler DW, Johnson RM, Daniel DM, Bunt KM, Baumgardner GA, Sugarman JR. Hospitals Collaborate to Decrease Surgical Site Infections. American Journal of Surgery. 2005;190:9–15.
  • DeRosier J, Stalhandske E, Bagian JP, Nudell T. Using Health Care Failure Mode and Effect Analysis. Joint Commission Journal on Quality and Safety. 2002;28:248–67.
  • Devers KJ, Pham HH, Liu G. What Is Driving Hospitals' Patient-Safety Efforts? Health Affairs. 2004;23(2):103–15.
  • Dudley RA, Frolich A, Robinowitz DL, Talavera JA, Broadhead P, Luft HS. Strategies to Support Quality-Based Purchasing: A Review of the Evidence. Technical Review 10. Rockville, Md.: Agency for Healthcare Research and Quality; 2004.
  • Frankel A, Graydon-Baker E, Neppl C, Simmonds T, Gustafson M, Gandhi TK. Patient Safety Leadership Walkrounds. Joint Commission Journal on Quality and Safety. 2003;29:16–26.
  • Gallagher TH, Waterman AD, Ebers AG, Fraser VJ, Levinson W. Patients' and Physicians' Attitudes Regarding the Disclosure of Medical Errors. Journal of the American Medical Association. 2003;289:1001–7.
  • Galvin RS, Delbanco S, Milstein A, Belden G. Has the Leapfrog Group Had an Impact on the Health Care Market? Health Affairs. 2005;24(1):228–33.
  • Gosfield AG, Reinertsen JL. The 100,000 Lives Campaign: Crystallizing Standards of Care for Hospitals. Health Affairs. 2005;24:1560–70.
  • Haig K. One Hospital's Journey toward Patient Safety—A Cultural Evolution. Medscape Money & Medicine. 2003;4. August 29. Available at http://www.medscape.com/viewarticle/460721.
  • Heget JR, Bagian JP, Lee CZ, Gosbee JW. John M. Eisenberg Patient Safety Awards. System Innovation: Veterans Health Administration National Center for Patient Safety. Joint Commission Journal on Quality and Safety. 2002;28:660–65.
  • Helmreich RL. On Error Management: Lessons from Aviation. British Medical Journal. 2000;320:781–85.
  • Helmreich RL, Merritt AC. Culture at Work in Aviation and Medicine: National, Organizational, and Professional Influences. Burlington, Vt.: Ashgate; 1998.
  • Hibbard JH, Peters E, Slovic P, Tusler M. Can Patients Be Part of the Solution? Views on Their Role in Preventing Medical Errors. Medical Care Research and Review. 2005;62:601–16.
  • Hibbard JH, Stockard J, Tusler M. Hospital Performance Reports: Impact on Quality, Market Share, and Reputation. Health Affairs. 2005;24:1150–60.
  • Hudson P. Applying the Lessons of High Risk Industries to Health Care. Quality and Safety in Health Care. 2003;12:7–12.
  • Infectious Diseases Society of America (IDSA). States Publicize Hospital Data on Infection Rates. IDSA News. 2005 (September 1). Available at http://www.idsociety.org.
  • Institute for Healthcare Improvement (IHI). Rapid Response Teams: Heading off Medical Crises at Baptist Memorial Hospital–Memphis. Boston; 2004a.
  • Institute for Healthcare Improvement (IHI). Trigger Tool for Measuring Adverse Drug Events. Boston; 2004b.
  • Institute for Healthcare Improvement (IHI). Failure Modes and Effects Analysis Tool. Boston; 2005a.
  • Institute for Healthcare Improvement (IHI). 100K Lives Campaign Overview. Boston; 2005b.
  • Institute of Medicine (IOM). To Err Is Human: Building a Safer Health System. Washington, D.C.: National Academy Press; 1999.
  • Institute of Medicine (IOM). Crossing the Quality Chasm: A New Health System for the 21st Century. Washington, D.C.: National Academy Press; 2001.
  • International Atomic Energy Agency (IAEA). Safety Culture in Nuclear Installations: Guidance for Use in the Enhancement of Safety Culture. Vienna; 2002.
  • Knox GE, Simpson KR, Garite TJ. High Reliability Perinatal Units: An Approach to the Prevention of Patient Injury and Malpractice Claims. Journal of Healthcare Risk Management. 1999;19(2):24–32.
  • Krause TR, Seymour KJ, Sloat KCM. Long-Term Evaluation of a Behavior-Based Method for Improving Safety Performance. Safety Science. 1999;32:1–18.
  • Leape L. Reporting of Adverse Events. New England Journal of Medicine. 2002;347:1633–38.

Executive Summary

In 1999, the Institute of Medicine (IOM) shocked the nation with its estimate that 44,000 to 98,000 Americans die each year in the hospital—not from the illnesses or injuries for which they sought treatment, but because of mistakes and oversights in medical care. Although the numbers cited by the IOM were eye-catching, the statistics were compelling because they captured a troubling idea: people can be harmed by care meant to help them.

The title of the IOM's report, To Err Is Human: Building a Safer Health System, emphasizes that humans are prone to error. Therefore, the report contends, keeping patients safe from harm cannot depend on human perfection. Industries such as aviation and nuclear power achieve highly safe operations by taking human fallibility into account when people are trained, systems are designed, and organizations are managed.

The IOM report raised public and professional awareness of the need for change and galvanized positive action from many parts of the health care system. Before its publication, experts and leaders were working to educate others about the problem and to discover and demonstrate practical means for improvement. In more recent years, these efforts have intensified and more organizations and individuals have joined the cause. Nevertheless, experts agree that far more needs to be done to realize the vision of the IOM report.

This report, commissioned on the fifth anniversary of To Err Is Human, illustrates innovations in five areas that hold great promise for improving patient safety if applied nationally:

  • promoting an organizational culture of safety,
  • improving teamwork and communication to promote patient safety,
  • enhancing rapid response to prevent heart attacks and other crises in the hospital,
  • preventing health care-associated infections in the intensive care unit, and
  • preventing adverse drug events throughout the hospital.

Using 10 case studies, this report describes the actions taken and lessons learned from organizations, teams, and collaborations, with suggestions for those seeking to replicate these successes. The organizations studied range from large integrated delivery systems to small community hospitals. Some have been recognized as leaders, while others are lesser-known innovators. Likewise, some of these efforts are now mature, while others are showing great promise. Overall, the cases demonstrate that improvement can occur in any organization where there is leadership, purposeful application of methods, and the will to change for the sake of patient safety.

One overriding lesson emerges: the programs identified organizational cultural change—the creation of a "patient safety culture"—as the critical element in making patients safer. Organizations seemed to differ chiefly in the methods used to instill a safety culture. The first two case studies examine organizations that have undertaken wide-ranging organizational change strategies, while the others describe more specific approaches. Regardless of method, the goal is a safety culture that promotes continuing innovation and improvement.

The organizations and teams studied for this paper have reported impressive improvements in patient safety and related organizational performance, including:

  • substantial reductions in observed adverse events and certain hospital-acquired infections;
  • apparent elimination of serious errors such as reported wrong-site surgeries;
  • reduction in death rates;
  • improvement in safety attitudes, teamwork, and communications behaviors associated with improved safety performance;
  • increased reporting of safety incidents and more effective investigations into their causes, leading to more useful recommendations for changes to prevent recurrence;
  • enhanced nursing morale and decreased nursing turnover and vacancies;
  • more efficient use of staff time by eliminating repeated work and delays; and
  • cost-savings resulting from shorter hospital stays.

Many of the cases demonstrate a correlation between patient safety and other domains of quality improvement. For example, collaborative rounds improved awareness of safety issues as well as patient- and family-centeredness of care and clinical outcomes. Interventions to improve safety through teamwork and communication improved staff morale and reduced nursing turnover. Empowering staff with improvement tools and knowledge is likely to have spillover effects on other domains of quality. The individuals, teams, and organizations studied were frequently engaged in other types of quality improvement activities or were considering ways to apply their learning to additional areas. These signs of collateral benefits and connections suggest that patient safety should not be approached as an isolated domain of quality.

The organizations profiled here would readily admit to being on only the first leg of the journey. Much more work is needed to achieve a truly safe environment for patients. These experiences, therefore, should be seen as a source of inspiration and encouragement to achieving even higher levels of performance. Although the cases focused on hospital settings, the lessons learned here are potentially applicable to ambulatory care environments, where patients receive most of their health care.

Table ES-1. Summary of Case Study Sites, Interventions, and Results

Sentara Norfolk General Hospital, Norfolk, Va.
Setting: A 569-bed, level 1 trauma center; one of six hospitals operated by Sentara Healthcare, a regional integrated health care delivery network.
Intervention: Accelerate patient safety improvement through a multifaceted culture change program involving setting and monitoring behavioral expectations, enhancing analytic capabilities, and streamlining and focusing on critical policies.
Selected results:
  • 42% increase in expected communications behaviors
  • 50% reduction in events of harm per 10,000 adjusted patient days when culture change strategies were applied system-wide

U.S. Dept. of Veterans Affairs, National Center for Patient Safety, Ann Arbor, Mich.
Setting: An integrated health care system that serves 5.1 million veterans and 7.6 million enrollees at more than 1,300 sites nationwide.
Intervention: Lead organizational cultural change by empowering local facilities and frontline staff with proven tools, methods, and initiatives for patient safety improvement.
Selected results:
  • 30-fold increase in internal safety incident reporting
  • 100% increase in perceived preventability of safety events studied by root cause analysis teams

Kaiser Permanente, Orange County, Calif., and Northern California region
Setting: Local medical centers of an integrated group-model health maintenance organization with 8.2 million people enrolled nationally.
Intervention: Initiate a preoperative safety briefing and a perinatal patient safety project as part of a program of organizational learning to promote effective teamwork and communication in high-risk areas.
Selected results:
  • A near doubling in the proportion of operating room staff reporting a positive teamwork climate
  • Two-thirds reduction in the turnover rate among operating room nursing staff

Concord Hospital, Concord, N.H.
Setting: A 295-bed community hospital that annually treats 250 patients in its cardiac surgery program.
Intervention: Use collaborative rounds involving all members of the care team with the patient and the patient's family to proactively identify and prevent potential errors and safety threats.
Selected results:
  • 56% lower than expected risk-adjusted mortality among cardiac surgery patients
  • 15% to 32% higher staff ratings of teamwork and work satisfaction compared to traditional rounds

Missouri Baptist Medical Center, St. Louis, Mo.
Setting: A 489-bed acute care hospital within BJC HealthCare, a health system comprising 13 hospitals and other facilities.
Intervention: Establish a rapid response team to intervene early with patients showing signs of medical deterioration, before they suffer acute crises.
Selected results:
  • 60% decrease in emergency calls for respiratory arrest
  • 15% decline in cardiac arrests
  • 3.95% reduction in hospital mortality rate

Johns Hopkins Hospital, Baltimore, Md.
Setting: A 14-bed oncology surgical ICU and a 15-bed surgical ICU within a 900-bed academic medical center; one of three acute care hospitals in the Johns Hopkins Health System.
Intervention: Implement a comprehensive unit-based safety program that empowers staff to identify and eliminate patient safety hazards following eight action steps.
Selected results:
  • 49% to 91% increase in the proportion of ICU staff reporting a positive safety climate
  • Elimination of 43 observed catheter-related bloodstream infections, saving eight lives
  • One-day reduction in average ICU length of stay, saving an estimated $2 million annually

VHA, Inc., Transformation of the Intensive Care Unit National Collaborative
Setting: More than 40 ICUs in diverse community hospitals nationwide that are members of VHA, a health care cooperative serving not-for-profit health care organizations.
Intervention: Focus all members of the care team on adhering to a "bundle" of evidence-based care practices associated with improved patient outcomes.
Selected results:
  • 29% to 41% reduction in combined rates of ventilator-associated pneumonias
  • 11% to 15% decrease in average lengths of stay across participating ICUs
  • 18% lower mortality

Allegheny General Hospital, Pittsburgh, Pa.
Setting: A medical ICU and a cardiac care ICU in an 829-bed academic health center, part of the six-hospital West Penn Allegheny Health System.
Intervention: Apply the Perfecting Patient Care approach, modeled on principles of the Toyota Production System, to specify best practices, eliminate variations from standards, and work toward ideal performance.
Selected results:
  • 76% reduction in the rate of central-line associated bloodstream infections, saving 18 lives per year
  • $2 million savings by reducing unreimbursed costs of care

Institute for Healthcare Improvement, Cambridge, Mass., and Premier, Inc., San Diego, Calif.
Setting: A nonprofit organization that works with health care institutions worldwide to spread quality improvement, and an alliance of more than 200 not-for-profit hospitals and health systems.
Intervention: Develop a simple trigger tool that organizations can use to measure the incidence and kinds of adverse events, so as to prioritize areas for improvement, design appropriate interventions, and track the effect of changes over time (an illustrative sketch of trigger-based screening follows this table).
Selected results:
  • 50-fold increase in detection of adverse drug events as compared with other common methodologies such as incident reports, pharmacy interventions, or billing codes

OSF St. Joseph Medical Center, Bloomington, Ill.
Setting: A 165-bed acute care hospital, part of OSF HealthCare, a six-hospital integrated health care network based in Peoria, Ill.
Intervention: Reduce adverse drug events by improving the process of medication reconciliation, the safe use of high-risk medications, and the reliability of medication dispensing.
Selected results:
  • 10-fold reduction in detected adverse drug events
  • 8% improvement in perceived safety culture among hospital staff
Note: ICU = intensive care unit.
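
As a rough illustration of the trigger tool methodology listed in Table ES-1, the sketch below screens patient charts for "triggers" that prompt a focused review. The trigger names, chart fields, and rate denominator are assumptions chosen for illustration and are not taken from the IHI tool itself.

    # Hypothetical sketch of trigger-based adverse drug event (ADE) screening; the
    # trigger list and chart fields are illustrative assumptions, not the IHI tool.
    from typing import Dict, List

    # "Triggers" are findings that often accompany an ADE and prompt a chart review,
    # such as an antidote being given or a medication being abruptly stopped.
    TRIGGERS = {"naloxone_given", "vitamin_k_given", "abrupt_medication_stop", "inr_above_6"}

    def flag_charts(charts: List[Dict]) -> List[Dict]:
        """Return charts containing at least one trigger, so reviewers can confirm
        whether an adverse drug event actually occurred."""
        return [chart for chart in charts if TRIGGERS & set(chart.get("findings", []))]

    def ade_rate_per_1000_doses(confirmed_ades: int, doses_dispensed: int) -> float:
        """One common way to track the measured ADE burden over time."""
        return confirmed_ades / doses_dispensed * 1000

    # Usage: screen a sample of charts, review the flagged ones, then trend the rate.
    sample = [
        {"id": 1, "findings": ["naloxone_given"]},
        {"id": 2, "findings": ["uneventful_course"]},
    ]
    flagged = flag_charts(sample)                                               # chart 1 only
    print(ade_rate_per_1000_doses(confirmed_ades=12, doses_dispensed=25_000))   # 0.48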