Quality and Safety in Nursing

performance by applying human factors, discussed more fully in Chapter 8. Safety science builds on Reason’s (2000) human error trajectory, which uses the model of lining up a stick through the holes of Swiss cheese; sometimes redundancies in the system fail, and all the holes line up, allowing errors to happen. Reason’s definition and classification of errors are in Textbox 1.3.
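
      To make the trajectory concrete, the short sketch below simulates independent defensive layers, each of which stops most errors; harm reaches the patient only when every layer's hole lines up. This is a minimal illustration of the model, and the layer names and failure probabilities are invented assumptions, not published figures.

```python
import random

# Hypothetical defense layers with invented per-error failure
# probabilities (assumptions for this sketch, not published figures).
LAYERS = {
    "order verification": 0.10,
    "pharmacy review": 0.05,
    "barcode scan at bedside": 0.08,
    "nurse double-check": 0.10,
}

def error_reaches_patient(layers=LAYERS):
    """Harm occurs only when the hole in every layer lines up."""
    return all(random.random() < p for p in layers.values())

def simulate(n_errors=1_000_000):
    """Estimate how often an error slips through all layers."""
    harmed = sum(error_reaches_patient() for _ in range(n_errors))
    return harmed / n_errors

# With independent layers the probabilities multiply:
# 0.10 * 0.05 * 0.08 * 0.10 = 4e-5, so redundancy makes harm
# rare -- yet, as Reason's model warns, never impossible.
print(f"Simulated harm rate: {simulate():.1e} per error")
```

      The multiplication of small probabilities is why redundancy works, and also why removing any single layer sharply raises the chance that the holes align.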

      An adverse event is an injury that results from care management and delivery, not from the underlying patient condition or the reason the patient was seeking care. Preventable adverse events are those attributed to any of the various types of error. Diagnostic errors delay diagnosis, prevent use of appropriate tests, or result in failure to act. Treatment errors occur while administering treatment, such as errors in administering medication, and often delay treatment or contribute to inappropriate care. Other examples are failure to provide prophylactic treatment, inadequate monitoring or follow‐up, failure to communicate, equipment malfunction, or other system failure.

      These examples illustrate an inconsistent nomenclature, a long list of terms that makes it difficult to report similar events consistently in a central system. Because errors can be defined in multiple ways with varied components, it is challenging to replicate the way the aviation industry aggregates reports of airline events.

Textbox 1.3 Definition and Classification of Errors

Definition
Error: The failure of a planned action to be completed as intended, or the use of an incorrect plan to achieve an aim.

Classification
Error of execution: The correct action does not proceed as intended.
Error of planning: The original intended action is not correct.
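
      Reason's two classes map naturally onto a small data structure, which suggests one way a central reporting system could tag events consistently despite the varied nomenclature described above. The field names below are hypothetical, chosen only to illustrate the taxonomy.

```python
from dataclasses import dataclass
from enum import Enum

class ErrorClass(Enum):
    """Reason's classification of errors."""
    EXECUTION = "correct action did not proceed as intended"
    PLANNING = "original intended action was not correct"

@dataclass
class ErrorReport:
    # Hypothetical reporting fields, for illustration only.
    description: str
    error_class: ErrorClass
    preventable: bool

# Example: a correctly ordered medication given late is an error
# of execution; choosing the wrong drug for the diagnosis would
# be an error of planning.
report = ErrorReport(
    description="Antibiotic dose administered three hours late",
    error_class=ErrorClass.EXECUTION,
    preventable=True,
)
print(report.error_class.name, "-", report.error_class.value)
```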

      Safety Culture

      A non‐punitive approach to patient harm is built on the engagement and commitment of everyone, from the boardroom to all staff, to accountability, honesty, integrity, and mutual respect in a culture of safety (Dempsey and Assi, 2018). Accountability is a critical aspect of a culture of safety; recognizing and acknowledging one’s actions is a hallmark of professional behavior. All staff are trained and empowered to participate in an error‐reporting system without fear of punitive action. Near misses are treated as opportunities to improve by examining gaps and correcting design flaws. Safety principles to eliminate hazards guide job design, management of equipment, and working conditions. Reis, Pavia, and Souza (2018) reported that teamwork within units and organizational learning–continuous improvement were the strongest dimensions of safety culture, while non‐punitive response to error, staffing, handoffs and transitions, and teamwork across units were weaker dimensions. Lin et al. (2017) reported a concept analysis of health care providers’ perceptions of safety climate with three attributes: (a) senior management creates a safe working environment; (b) health care providers share perceptions of safety in their work environment; and (c) information about safety is disseminated effectively.

      Just Culture

      In a just culture, the focus is on determining what went wrong rather than on identifying who committed the error in order to assign blame and punishment (Armstrong, 2019). Just culture establishes an environment in which errors and near misses are acknowledged, reported, and analyzed for ways to improve the system. Organizations with a culture of safety have implemented risk‐management processes to collect error reports for root cause analysis, often classifying events with a tiered system of potential for harm. Carefully detailing all steps and decisions leading to an error or near miss can inform a system redesign that lessens the chance of future occurrence. The focus is on improving the system to prevent future errors rather than merely blaming individuals; exploring what happened acknowledges the complex systems and human factors that influence safety.
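
      As one illustration of the tiered classification mentioned above, the sketch below routes higher‐harm events to full root cause analysis while lower tiers feed aggregate trend review. The tier labels and threshold are invented for the example; real programs use their own published severity indices.

```python
from enum import IntEnum

class HarmTier(IntEnum):
    """Invented harm-potential tiers for illustration;
    not a specific published severity index."""
    NEAR_MISS = 0       # intercepted before reaching the patient
    NO_HARM = 1         # reached the patient, caused no harm
    TEMPORARY_HARM = 2  # required intervention or extra monitoring
    SERIOUS_HARM = 3    # permanent harm or life-threatening event

def triage_reports(reports, threshold=HarmTier.TEMPORARY_HARM):
    """Send higher-tier events to root cause analysis; aggregate
    the rest for trend review, so near misses still teach."""
    rca_queue = [r for r in reports if r["tier"] >= threshold]
    trend_review = [r for r in reports if r["tier"] < threshold]
    return rca_queue, trend_review

reports = [
    {"id": 1, "tier": HarmTier.NEAR_MISS},
    {"id": 2, "tier": HarmTier.SERIOUS_HARM},
    {"id": 3, "tier": HarmTier.NO_HARM},
]
rca, trend = triage_reports(reports)
print("RCA:", [r["id"] for r in rca], "Trend:", [r["id"] for r in trend])
```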

      A Systems Approach

      A system is a set of interdependent components that interact to achieve a common goal (Dolansky et al., 2017; McNab et al., 2020). For example, a hospital is a system composed of service lines, nursing care units, ancillary care departments, and outpatient care clinics; each of these is a microsystem of the larger system. The way these separate but united components interact and work together is a significant factor in delivering high‐quality, safe care (Yakusheva et al., 2020). By crosswalking the six QSEN competencies with The Joint Commission safety standards and the Magnet standards (Lyle‐Edrosolo and Waxman, 2016), organizational leadership helps align quality and safety goals with mission and vision so that safety is practiced consistently throughout all areas and levels of the system. Nurses and other health care professionals need systems thinking skills, that is, thinking about how one action affects the next, to have an impact on patient care improvement. A front‐line example of systems thinking is the way nurses coordinate turning a patient every two hours or manage multiple ports for invasive procedures.

      Health care delivery comprises intersecting units, or microsystems, and how these systems function together affects quality and safety outcomes. For instance, the way patients are assigned beds from the ED to an inpatient unit, the way the lab responds to urgent blood draws, and the way patients are discharged to a skilled nursing facility are all opportunities to standardize operational procedures and improve outcomes.

      High‐Reliability Organizations

      High‐reliability organizations (HROs) focus on safety; mindfulness is pervasive in the culture. Focusing on where the next error may occur allows providers to increase vigilance, establish checklists, or implement other preventive measures (see Chapter 8). Five principles guide HROs: sensitivity to operations, preoccupation with failure, reluctance to simplify, deference to expertise, and commitment to resilience. HROs apply a systems approach (Oster and Braaten, 2020), shifting error prevention from the individual to shared accountability across the system. Understanding how an adverse event trajectory unfolded provides the opportunity to reconsider protocols, procedures, or other actions that will reduce the possibility of a repeat error. To prevent harm to patients, organizations adopt operational systems and processes that minimize risk and focus on maximizing interception of errors before harm occurs (Sherwood and Armstrong, 2020).

      Simplifying and standardizing processes for reliable results are key components of HROs. Reliability means expecting the same result each time an action occurs; a reliable system therefore seeks defect‐free operations despite a high‐risk environment, for example by preventing wrong‐site surgery or health care–acquired infections. The reduction in CLABSI (central line–associated bloodstream infection) events, for instance, came from following an evidence‐based, standardized care process.
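
      CLABSI outcomes are conventionally tracked as a rate per 1,000 central‐line days, which makes improvement from a standardized care process visible. The sketch below computes that rate; the before‐and‐after numbers are invented for illustration.

```python
def clabsi_rate(infections, central_line_days):
    """CLABSI rate per 1,000 central-line days, the conventional
    denominator for this infection metric."""
    if central_line_days <= 0:
        raise ValueError("central_line_days must be positive")
    return infections / central_line_days * 1000

# Invented illustrative numbers: a unit's rate before and after
# adopting an evidence-based, standardized insertion bundle.
before = clabsi_rate(infections=6, central_line_days=2400)  # 2.50
after = clabsi_rate(infections=2, central_line_days=2600)   # ~0.77
print(f"Before: {before:.2f}  After: {after:.2f}  (per 1,000 line-days)")
```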

      Reliability has economic consequences. Hospital reimbursement is increasingly tied to quality and safety outcomes (see Chapter 2). Hospitals may not be reimbursed for patient harms such as hospital‐acquired infections; therefore, reliable procedures are needed to ensure adherence to hand‐washing procedures, evidence‐based
