The Joint Commission (TJC) is an independent, not-for-profit organization in the United States that accredits many hospitals and other health care organizations. TJC publishes a list of the most frequently occurring sentinel events, summarized in Table 4.1. Reporting of sentinel events to TJC is voluntary, so reported events represent only a small proportion of actual sentinel events.
Table 4.1 Five Most Frequent Sentinel Events, 2014 through 2017 (TJC, January 2018)

| Event | Definition | Number of Reported Events |
| --- | --- | --- |
| Unintended retention of a foreign body | The retention of a foreign object in a patient after surgery or another procedure. | 481 |
| Wrong-patient, wrong-site, wrong-procedure | A wrong-patient procedure is performed on the incorrect patient. A wrong-site procedure involves operating on the wrong site (e.g., the wrong side) of a patient. A wrong procedure occurs when an incorrect procedure is performed on a patient. | 409 |
| Fall | An unplanned descent to the floor resulting in death, permanent harm, or severe temporary harm. | 404 |
| Suicide | Intentionally killing oneself in a health care setting. | 361 |
| Delay in treatment | A patient does not receive an ordered treatment (e.g., a medication or lab test) within the intended time frame, and the delay results in death, permanent harm, or severe temporary harm. | 290 |
| Total number of reported sentinel events | A sentinel event is a subcategory of adverse event: a patient safety event (not primarily related to the natural course of the patient's illness or underlying condition) that reaches a patient and results in death, permanent harm, or severe temporary harm. | 3,326 |
Source: The Joint Commission (January 2018). Patient Safety Systems. Retrieved from https://www.jointcommission.org/assets/1/6/PS_chapter_HAP_2018.pdf.
The Anatomy of an Error
Active Errors
Errors are often noticed at the point of care, where nurses and other clinicians interact with patients. This point of care is considered the "sharp end" of a triangle (Cook & Woods, 1994). Such errors are called active errors because they occur at the interface between humans and a complex system. Clinicians often blame themselves when errors happen. However, their work with patients is shaped by many factors and decisions made before an actual error occurs. These factors and decisions are considered the "blunt end" of the triangle.
Latent Errors
Latent errors are hidden problems within health care systems that contribute to adverse events (Agency for Healthcare Research and Quality [AHRQ], 2018a, 2018b). For example, policies and procedures within an organization may be inaccessible, difficult to understand, or inaccurate. Work processes might be confusing and patient handoffs may be rushed and inadequate. The environment may be cramped and noisy, making it difficult to concentrate. Technology may fail or be cumbersome to use. Individuals may blame others rather than taking personal responsibility. The culture of the hospital might hinder a nurse's ability to speak up about safety concerns. All of these “blunt end” factors may contribute to an error at the “sharp end,” where clinicians interact with the patient.
James Reason (1997) proposed the Swiss Cheese model, shown in Figure 4.1, to illustrate how errors occur. The model suggests that every step in a process has the potential for error. The holes in the Swiss cheese represent opportunities for a process to fail, and each slice is a defensive layer that can prevent an error in the process. An error may pass through a hole in one layer, but in the next layer the hole is in a different spot, so that layer catches the error before it reaches the patient. More layers of cheese and smaller holes mean more errors are stopped or caught. When the holes in every layer line up, the error passes through all of the defenses and reaches the patient.
Figure 4.1 The Swiss Cheese Model. Source: © Patti Ludwig-Beymer, used with permission.
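The layered-defense logic can also be expressed arithmetically. The sketch below is an illustration added here, not part of Reason's original presentation: it assumes each layer independently misses an error with some small probability, so the chance that an error slips past every defense is the product of those probabilities. This is why both adding layers and shrinking holes reduce the number of errors that reach the patient.

```python
# Illustrative sketch of the Swiss Cheese model (hypothetical numbers).
# Assumption: each defensive layer independently misses an error with
# probability p (its "hole size"). An error reaches the patient only
# when every layer misses it, i.e., when all the holes line up.

def p_error_reaches_patient(miss_probabilities):
    """Probability that an error passes through every layer."""
    result = 1.0
    for p in miss_probabilities:
        result *= p
    return result

# Three layers that each catch 90% of errors (hole "size" 0.10):
print(p_error_reaches_patient([0.10, 0.10, 0.10]))        # ~0.001 (about 1 in 1,000)

# Four layers with smaller holes (each catching 95% of errors):
print(p_error_reaches_patient([0.05, 0.05, 0.05, 0.05]))  # ~0.00000625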
High Reliability Organizations
Building a high reliability organization (HRO) requires a cultural transformation designed to ensure safe practices and reduce errors and sentinel events in health care. Health care is complex and carries the risk of significant, potentially catastrophic consequences when failures occur. HROs operate under these trying conditions yet have fewer accidents. They are able to provide consistent health care at a high level of excellence over a long period of time. HROs in health care establish and maintain high quality and safety expectations for patient care, and their quality and safety error rates approach zero (Weick & Sutcliffe, 2007).
The risk of health care error is a function of both probability and consequence. For example, consider the care needed for a dehydrated patient with renal failure. Administering fluids too slowly can result in prolonged hypotension; administering fluids too rapidly can result in fluid retention and heart failure. An IV pump assists the nurse in delivering accurate amounts of fluid and thereby decreases the probability of error. However, if the pump is programmed incorrectly or fails completely, the consequences can still be catastrophic. By decreasing the probability of error even where the consequences remain severe, HROs make health care systems safer.
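A brief worked example may make the probability-and-consequence idea concrete. The scoring below is a hypothetical sketch loosely modeled on FMEA-style risk matrices; the scales and numbers are illustrative and are not from the chapter. Risk is treated as the product of the likelihood of a failure and the severity of its consequence: the IV pump shrinks the likelihood term, but the severity term stays high, which is why a mis-programmed or failed pump remains dangerous.

```python
# Hypothetical risk scoring: risk = probability x consequence.
# The 1-5 scales below are illustrative, not from the text.

def risk_score(probability: int, consequence: int) -> int:
    """Simple multiplicative risk score (higher = riskier)."""
    return probability * consequence

# Manual gravity infusion: dosing errors fairly likely, consequences severe.
manual_infusion = risk_score(probability=4, consequence=5)   # 20

# IV pump: the probability of a dosing error drops sharply...
pump_infusion = risk_score(probability=1, consequence=5)     # 5

# ...but the consequence term is unchanged, so a failed or mis-programmed
# pump can still be catastrophic, and additional defenses are still needed.
print(manual_infusion, pump_infusion)
```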
Origins of HRO
HROs operate in complex, high-hazard situations for extended periods without serious accidents or catastrophic failures, relentlessly prioritizing safety over other performance pressures. An example is a military aircraft carrier. The carrier operates under significant production pressures, with aircraft taking off and landing every 48–60 seconds; constantly changing conditions; and a hierarchical (military) organizational structure. Nevertheless, personnel consistently prioritize safety and have both the authority and the responsibility to make real-time operational adjustments that keep safe operations the top priority (AHRQ, 2018a, 2018b).
In the 1970s, research conducted by the National Aeronautics and Space Administration suggested that most commercial airplane crashes were caused by communication failures among pilots and crew, not by mechanical failures. In some cases, co-pilots were aware that pilots were making unsafe decisions but did not verbalize their concerns because of the authority gradient, the reluctance of subordinates to question the decisions of those in authority.