Collaborative Approaches to Evaluation
Technical tools are the research designs, measurement techniques, and analysis strategies that evaluators use to guide their evaluation practice decisions. Across the chapters, technical tools appear in two ways: those used to carry out the evaluation itself and those used to engage in research on evaluation (RoE). Of the chapters grounded in a specific evaluation, almost all (Chapters 3, 4, 5, and 7) used a multimethod design, and one used a qualitative case study design (Chapter 6). The RoE studies themselves, however, drew on several methods: retrospective case study (Chapters 2, 5, and 6), qualitative thematic analysis (Chapters 3 and 9), Q-methodology (Chapter 7), meta-reflection (Chapter 4), and participatory action research (Chapter 8). Across these RoE studies, the authors provide examples of how evaluators can engage in systematic analysis of their work as a way to understand and describe the complex work of professional evaluation. Readers interested in this aspect of practice will find much value in the cross-chapter analysis in Chapter 10.
Practical tools are the strategies, interpersonal skills, practices, and moves evaluators use as they carry out an evaluation. Arguably, these are among the hardest tools to see and to teach across expert knowledge occupations. This is why, for example, doctors spend time developing bedside manner, psychotherapists spend time unpacking video recordings of themselves with patients, and teachers spend time critically reflecting on their in-the-moment instructional decisions. In all of these instances, you have to be in the right place at the right time to catch a glimpse of a professional using a particular practical tool, and at the same time, you have to have a systematic process in place for unpacking the use of these tools. A strength of the chapters in this volume is that they make visible some of the practical tools evaluation practitioners and educators use: for example, how evaluators go about building relationships and with whom, the importance of trust, and the communication strategies evaluators use that are aligned with CAE principles. Moreover, in Chapters 7 and 9, we see how novice and emergent evaluators learn to identify and use these practical tools, as well as the practical tools evaluation educators use to foster that learning.
While each of these tools is important, in practice they are, and must be, intertwined. Learning about a theory or learning to justify a technical method is not the same as learning to use that theory in practice or to apply a particular design in an evaluation, just as learning about the technical aspects of writing and about narratology1 is not learning to write. The latter requires that an author actually engage in writing: understanding, for example, how to use technical rules, when it makes sense to break them, how potential readers understand or interpret what they are reading, how they react to prose and whether that reaction is what the author intended, and so forth. Because evaluation is situated in the social, learning how to evaluate requires all three tools. It is through practice that one learns to use evaluation theories, the benefits and limits of particular evaluation approaches, how to generate evidence that a wide body of stakeholders will perceive as credible, which dilemmas to anticipate, how to think through addressing unanticipated issues, and so forth.
1 The Oxford English Dictionary defines narratology as “the branch of knowledge or criticism that deals with the structure and function of narrative and its themes, conventions, and symbols” (https://en.oxforddictionaries.com/definition/narratology).
This volume has given us a window into our complex work, which will be useful for novice and seasoned evaluators learning how to use the conceptual, technical, and practical tools of our profession. It will also be useful for evaluation educators who are working to facilitate the process of learning to practice. Evaluation researchers who are interested in describing and understanding practice will also find much use in this volume.
Bianca Montrosse-Moorhead, Marvin C. Alkin, and Christina A. Christie Volume Editors
Preface
This edited volume arises from the initial work of the Collaborative Opportunities to Value Evaluation (COVE) research group2 on the development and validation of a set of principles to guide practice in collaborative approaches to evaluation (CAE): evaluations in which evaluators work in partnership with program stakeholders to produce evaluative knowledge. Such approaches are on the rise. Several types (e.g., rapid rural appraisal, participatory action research) have been practiced in international development contexts for decades, but many others have been developed more recently (e.g., contribution analysis, developmental evaluation) and give evaluation practitioners and commissioners a range of options that depart quite significantly from traditional mainstream approaches to evaluation. At least partly in response to this growing family of approaches, the COVE research group committed to developing and validating a set of evidence-based principles to guide CAE practice. These principles were first published