A Handbook for High Reliability Schools. Robert J. Marzano

For example, a school seeking to ensure that faculty and students perceive the school environment as safe and orderly (a leading indicator) might formulate the following lagging indicator to measure its progress toward a safe and orderly environment: “Few, if any, incidents occur in which rules and procedures are not followed.” To meet this lagging indicator, school leaders would have to determine how many incidents constitute a “few.” This number is called a criterion score; it is the score a school aims to achieve for the lagging indicator. School leaders then track the actual number of incidents occurring in the school and compare that number to the criterion score. If the results meet the criterion score, the school considers itself to have met the lagging indicator, and the evidence can be used to validate the school’s achievement of a specific high reliability level. If the results do not meet the criterion score, the school continues or adjusts its efforts until it does meet the score.
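To make the criterion-score comparison concrete, the following is a minimal sketch in Python of how a school's data team might check tracked incident counts against a criterion score. The criterion score, months, and counts are hypothetical assumptions for illustration, not values from the HRS model.

```python
# Minimal sketch: comparing tracked incidents to a criterion score.
# The criterion score and incident counts are hypothetical examples.

criterion_score = 3  # the most monthly incidents the school will count as "few"

# Incidents in which rules and procedures were not followed, by month
monthly_incidents = {"September": 5, "October": 4, "November": 2}

for month, count in monthly_incidents.items():
    status = "meets" if count <= criterion_score else "does not meet"
    print(f"{month}: {count} incidents; {status} the criterion score of {criterion_score}")
```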

      To design lagging indicators and criterion scores, school leaders can use several different approaches. The first is a percentage approach, wherein school leaders create a lagging indicator stating that a certain percentage of responses or data collected will meet a specific criterion. For example, a percentage lagging indicator for level 1 might be “Ninety percent of survey responses will indicate agreement that the school is safe and orderly.” School leaders can use a sentence stem such as “________________ percent of responses or data will ________________” to formulate percentage lagging indicators.

      A second approach involves setting a cutoff score, below which no responses or data will fall. The following is a possible cutoff lagging indicator for level 2: “No teacher will improve by fewer than two levels on the scale for each of his or her growth goals each year.” School leaders could use a sentence stem such as “No responses or data will fall below ________________” to compose cutoff lagging indicators.

      In cases where a school has received fairly high initial survey responses but still wants to improve, school leaders can choose to set lagging indicators for specific amounts of growth. A growth lagging indicator for level 3 might say, “Survey responses regarding all students having adequate opportunity to learn will improve 10 percent.” An appropriate sentence stem for growth lagging indicators would be “Responses or data will be ________________ percent higher than original responses or data.”

      Finally, lagging indicators can be designed around the creation of a concrete product as evidence of high levels of performance. A concrete product lagging indicator for level 4 might say, “Written goals are available for each student in terms of his or her performance on common assessments.” School leaders could use a sentence stem such as “A document or report stating ________________ exists” to design concrete product lagging indicators.
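For school leaders who track these data in a spreadsheet or a short script, the four approaches just described reduce to four simple checks. The following Python sketch is illustrative only; the function names, thresholds, and sample data are assumptions, not part of the HRS model.

```python
import os

# Illustrative checks for the four kinds of lagging indicators.
# All names, thresholds, and sample data are hypothetical.

def meets_percentage(responses, is_met, criterion_pct):
    """Percentage approach: at least criterion_pct of responses satisfy the criterion."""
    share = 100 * sum(is_met(r) for r in responses) / len(responses)
    return share >= criterion_pct

def meets_cutoff(data, cutoff):
    """Cutoff approach: no response or data point falls below the cutoff."""
    return min(data) >= cutoff

def meets_growth(original, current, growth_pct):
    """Growth approach: current results are at least growth_pct percent above the original."""
    return current >= original * (1 + growth_pct / 100)

def product_exists(path):
    """Concrete product approach: the document or report exists."""
    return os.path.exists(path)

# Example: "Ninety percent of survey responses will indicate agreement"
# (agreement taken here as 4 or 5 on a hypothetical 5-point scale).
survey = [5, 4, 4, 5, 3, 4, 5, 5, 4, 5]
print(meets_percentage(survey, lambda r: r >= 4, 90))  # True: 9 of 10 responses agree
```

The growth lagging indicator above, for instance, would translate to meets_growth(original, current, 10) under these assumptions.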

      The following chapters list leading indicators for each level. Lagging indicators, however, must be formulated for each specific school by its leaders. Schools should identify lagging indicators and set criterion scores that are appropriate to their unique situation and needs. In each chapter, we provide a template leaders can use to formulate lagging indicators and set criterion scores for each level.

      After creating lagging indicators for a level, school leaders implement specific activities or initiatives that help them meet the goals inherent in the lagging indicators. For example, if a school’s lagging indicator states that it will average no more than one incident per month in which rules or procedures are not followed, and the school currently averages five such incidents per month, its leaders must implement activities or initiatives that change the current state of the school.

      We refer to the suggested activities or initiatives that school leaders implement to meet their lagging indicators as critical commitments. It is important to note that these commitments are based on the cumulative experience of practitioners and researchers at Marzano Resources and the research and development work of Robert J. Marzano. Therefore, the critical commitments identified in this book should be considered strong suggestions. Certainly a school can reach high reliability status for a given level without implementing these suggestions; however, years of experience have established these activities as very useful in achieving high reliability status for a given level. Critical commitments within each level are shown in table I.3.

Table I.3: HRS Critical Commitments

Level 5: Get rid of time requirements. Adjust reporting systems accordingly.
Level 4: Develop proficiency scales for the essential content. Report status and growth on the report card using proficiency scales.
Level 3: Continually monitor the viability of the curriculum. Create a comprehensive vocabulary program. Use direct instruction in knowledge application and metacognitive skills.
Level 2: Create an evaluation system whose primary purpose is teacher development:
• The system is comprehensive and specific.
• The system includes a developmental scale.
• The system acknowledges and supports growth.
Level 1: Implement the professional learning community (PLC) process.

      The critical commitments for each level are described in depth in the following chapters. We believe they are essential to achieving high reliability status.

      Once a school has met the criterion scores for a level’s lagging indicators, it is considered to have achieved high reliability status for that level. However, being a high reliability school at a given level involves more than meeting criterion scores for lagging indicators. Recall from the previous discussion of high reliability organizations that implementing processes and procedures to prevent problems is only part of what they do. High reliability organizations also constantly monitor critical factors, looking for changes in data that indicate the presence of problems.

      Similarly, high reliability schools monitor critical factors and immediately take action to contain and resolve the negative effects of problems as quickly as possible. Even after a school has achieved high reliability status for a specific level, its leaders continue to collect and analyze data related to leading and lagging indicators to ensure that the expectations of that level are continuously met over time. In the event that data for a specific indicator cease to meet expectations, school leaders intervene to identify the problem, minimize any negative effects of the problem, and either strengthen existing processes and procedures or implement new ones to fix the current problem and prevent future ones.

      Constantly monitoring critical factors for problems requires continual data collection and observation. Consider an organization with very little tolerance for errors: the U.S. Navy. In particular, consider an aircraft carrier, a ship from which fighter jets, helicopters, and other aircraft take off and land. The number of potential errors in such an environment is mind-boggling. For example, even small debris (like a pebble or scrap of cloth) on the flight deck can cause catastrophic problems for the finely tuned engines and other sensitive systems of naval aircraft. Therefore, the U.S. Navy implements systematic FOD walks. FOD stands for “foreign object debris,” and during a FOD walk, personnel on the aircraft carrier walk along the deck shoulder to shoulder, picking up anything they find. Such procedures occur multiple times each day. Figure I.1 shows a FOD walk being conducted on board an aircraft carrier.
