Using Predictive Analytics to Improve Healthcare Outcomes
Using data without screening for error first is like putting a ship to sail before checking if the ship has holes in it.
If the data is not checked for accuracy, anxiety and frustration build for staff members and leaders alike as the organization’s inconsistent data is reported to Leapfrog, Hospital Compare, and the Magnet® Recognition Program. The anxiety is tied to the reporting of this data to regulatory and accreditation organizations, because those organizations publish hospital‐level data to the public, who then use it to choose hospitals. The frustration is tied to the ongoing struggle to understand the fluctuation in scores, which suggests that other variables affecting the outcomes were either unmeasured or measured incorrectly. Given this pattern of collecting, distributing, and acting on flawed data, with no sustainable improvement toward the goals of high reliability, patient safety, and full reimbursement opportunities, the question is: How can we get our hands on data that includes everyone on all units and points us toward the precise actions we can take to improve operations?
Taking on the Challenge
In our organization, the first step in getting out of the problematic data cycle was realizing that the quality committee members were clinicians and business administrators, not data analysts who could translate all those numbers into valid, reliable, widely consumable information. It had become clear that the money we were spending on the work of the quality committee was like spending three dollars on a bottle of wine and then expecting to have a substantive conversation about the virtues of a truly fine wine. If we wanted to go beyond shuffling papers full of data to having a data process that truly informs, we needed to invest time and resources in experts in this field. It was clear that a more systematic and scientific data management structure, and equally systematic and scientific processes, were needed if our organization was to positively impact patient care, staff satisfaction, and financial outcomes. To develop the structures and processes necessary to maximize the reliability, validity, relevance, and consumability of the current data, executive leaders hired a data analyst trained in research methods, measurement models, and predictive analytics to assist the team. It should be noted that this PhD‐prepared data analyst had been a bedside nurse for 11 years and was familiar with our framework of care, Relationship‐Based Care® (RBC) (Creative Health Care Management, 2017; Koloroutis, 2004), so he understood our context very well. We felt as if we had struck gold.
In our initial and subsequent Magnet® journeys, RBC had been chosen as the organization's care delivery model because it aligned so thoroughly with our mission and vision. Through our work implementing RBC, we quickly discovered that staff members who had clarity about who they are, what their role is, and how the healthcare system works demonstrated better and more sustainable outcomes (Hozak & Brennan, 2012; Nelson & Felgen, 2015; Nelson, Nichols, & Wahl, 2017). Showing evidence of this rapid improvement, however, required a different approach, since “clarity of self, role, and system” were very unusual things to measure. It required administrators to trust that even though measuring caring and clarity did not directly measure costs or outcomes, a model that improves caring and clarity does make a financial impact in the long run, and it was therefore worth the time and investment to study these variables.
The quality committee partnered with the data analyst to measure clarity of self, role, and system, as we learned that a measurement instrument had already been developed that could show us what positive outcomes clarity predicts (Felgen & Nelson, 2016). We used predictive analytics to study how clarity related to nurse job satisfaction, and how this impacted (a) caring as reported by the patient, (b) caring as reported by the care providers, (c) sleep quality of the patients, and (d) HCAHPS scores. Through this work, we discovered that our hunches about what we were implementing in RBC were right: increased clarity resulted in increased job satisfaction, which in turn improved caring as reported by both patients and care providers (Nelson & Felgen, 2015). We also found that staff members who had higher job satisfaction had patients who reported higher HCAHPS scores and better sleep during their hospital stay. We did strike gold!
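The shape of this analysis can be sketched in a few lines of code. The following is a minimal, hypothetical illustration only: the variable names, effect sizes, and simulated data are invented for demonstration and do not come from the study described above, which used its own instruments and models.

```python
import random

random.seed(42)

# Hypothetical simulation: "clarity" (of self, role, and system) drives
# nurse job satisfaction, which in turn drives patient-reported HCAHPS
# scores. All numbers here are invented for demonstration.
n = 200
clarity = [random.gauss(3.5, 0.8) for _ in range(n)]
satisfaction = [0.6 * c + random.gauss(1.0, 0.3) for c in clarity]
hcahps = [8.0 * s + random.gauss(20.0, 4.0) for s in satisfaction]

def slope(x, y):
    """Ordinary least-squares slope of y regressed on x."""
    mx = sum(x) / len(x)
    my = sum(y) / len(y)
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    var = sum((xi - mx) ** 2 for xi in x)
    return cov / var

b1 = slope(clarity, satisfaction)   # clarity -> job satisfaction
b2 = slope(satisfaction, hcahps)    # job satisfaction -> HCAHPS

print(f"clarity -> satisfaction slope: {b1:.2f}")
print(f"satisfaction -> HCAHPS slope: {b2:.2f}")
```

In a real analysis, each link in the chain would be tested with proper regression or path-analysis software, with attention to confounders and measurement error; the point of the sketch is simply that "clarity predicts satisfaction, which predicts patient-reported outcomes" is an ordinary statistical question once the variables are measured.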
Through our use of predictive analytics, we measured what was long thought unmeasurable, as we demonstrated that the concepts taught in RBC did matter.
Our study used predictive analytics to answer the questions, and question the assumptions, of staff members and leaders. The data provided at every step in the process became the vehicle for us to go deeper into conversations about the everyday operations of the units. We now had predictive analytics to help us identify the specific factors that would allow us to refine operations with a high likelihood that our efforts would be successful. The aspects of care we investigated further, using the data, addressed:
1. The quality of relationships on the unit,
2. How staff members assessed their own care delivery and that of others,
3. Whether there was consistency in our care delivery,
4. Whether the mission and values of the organization were visible and tangible in our care delivery, and
5. How well the hospital’s operational systems worked for delivering care to patients and families.
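One way such "nontraditional" aspects become measurable is by grouping survey items into domain scores. The sketch below is a hypothetical illustration only: the domain keys mirror the five aspects listed above, but the item names and Likert responses are invented and are not the organization's actual instrument.

```python
# Hypothetical mapping of invented survey items to the five domains above.
DOMAINS = {
    "relationships": ["rel_1", "rel_2"],
    "self_assessment": ["self_1", "self_2"],
    "consistency": ["cons_1"],
    "mission_visibility": ["mis_1", "mis_2"],
    "system_support": ["sys_1"],
}

# One respondent's answers on a 1-5 Likert scale (invented values).
responses = {
    "rel_1": 4, "rel_2": 5,
    "self_1": 3, "self_2": 4,
    "cons_1": 2,
    "mis_1": 5, "mis_2": 4,
    "sys_1": 3,
}

def domain_scores(resp):
    """Mean item score per domain for a single respondent."""
    return {
        domain: sum(resp[item] for item in items) / len(items)
        for domain, items in DOMAINS.items()
    }

scores = domain_scores(responses)
for domain, score in scores.items():
    print(f"{domain}: {score:.1f}")
```

Aggregated across respondents and units, domain scores like these become candidate predictor variables that can be related to outcomes such as falls or HCAHPS results.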
While these were nontraditional things to measure, we had confidence that by understanding them better, we could understand more clearly what predicted other outcomes. These variables were all front‐of‐mind because of our work with Relationship‐Based Care. And because these variables came directly from the people closest to the work, the results of this study described the actual behaviors of staff members and leaders, providing highly relevant, actionable information and insights regarding the care environment.
The data analyst guided the staff members and leaders to discover the information behind the data by teaching them to ask questions starting with “how” and “why.” The time we spent together refining those questions, digging deeper, and examining hunches allowed truly helpful stories to emerge from staff members and leaders about the variables we were considering. These stories began to reveal possible predictor variables related to all of the variables of interest we sought to change.
One example of a variable that is always of interest is fall rates. Traditionally in our organization, when the falls data for a unit was trended over quarters, action plans for the unit usually included reeducating the staff on two of our current fall reduction strategies—use of the “falling star” program and yellow socks—as well as consistently asking whether the patient's call bell was within reach. Admittedly, some very helpful action plans were created based on these inquiries. However, the data is now also examined through meaningful questions such as “Why do falls happen more often in the bathrooms at night?” “How is rounding conducted at night?” and “How do staff members interact with the patient during rounding?” When decisions about what to measure are made using a process that includes discussion of a wide and relevant array of real‐world influences, the subsequent measurements will produce results that allow people to improve operations and design safer care delivery processes. These discussions, which included people from