
RealWorld Evaluation
Working Under Budget, Time, Data, and Political Constraints

Third Edition


August 2019 | 568 pages | SAGE Publications, Inc
This book addresses the challenges of conducting program evaluations in real-world contexts where evaluators and their clients face budget and time constraints and where critical data may be missing. The book is organized around a seven-step model developed by the authors, which has been tested and refined in workshops and in practice. Vignettes and case studies, representing evaluations from a variety of geographic regions and sectors, demonstrate adaptive possibilities ranging from small projects with budgets of a few thousand dollars to large-scale, long-term evaluations of complex programs. The text incorporates quantitative, qualitative, and mixed-method designs, and this Third Edition reflects important developments in the field since the Second Edition, including new chapters on gender issues and on working with big data.
 
Preface
 
PART I: THE SEVEN STEPS OF THE REALWORLD EVALUATION APPROACH
 
1. Overview: RealWorld Evaluation and the Contexts in Which It Is Used
Welcome to the RealWorld Evaluation  
The RealWorld Evaluation Context  
The Four Types of Constraints Addressed by the RealWorld Approach  
Additional Organizational and Administrative Challenges  
The RealWorld Approach to Evaluation Challenges  
Who Uses RealWorld Evaluation, for What Purposes, and When?  
Summary  
Further Reading  
 
2. First Clarify the Purpose: Scoping the Evaluation
Stakeholder Expectations of Impact Evaluations  
Understanding  
Developing the Program Theory Model  
Identifying the Constraints to Be Addressed by RWE and Determining the Appropriate Evaluation Design  
Developing Designs Suitable for RealWorld Evaluation Conditions  
Summary  
Further Reading  
 
3. Not Enough Money: Addressing Budget Constraints
Simplifying the Evaluation Design  
Clarifying Client Information Needs  
Using Existing Data  
Reducing Costs by Reducing Sample Size  
Reducing Costs of Data Collection and Analysis  
Assessing the Feasibility and Utility of Using New Information Technology (NIT) to Reduce the Costs of Data Collection  
Threats to Validity Arising From Budget Constraints  
Summary  
Further Reading  
 
4. Not Enough Time: Addressing Scheduling and Other Time Constraints
Similarities and Differences Between Time and Budget Constraints  
Simplifying the Evaluation Design  
Clarifying Client Information Needs and Deadlines  
Using Existing Documentary Data  
Reducing Sample Size  
Rapid Data-Collection Methods  
Reducing Time Pressure on Outside Consultants  
Hiring More Resource People  
Building Outcome Indicators into Project Records  
New Information Technology for Data Collection and Analysis  
Common Threats to Adequacy and Validity Relating to Time Constraints  
Summary  
Further Reading  
 
5. Critical Information Is Missing or Difficult to Collect: Addressing Data Constraints
Data Issues Facing RealWorld Evaluators  
Reconstructing Baseline Data  
Special Issues in Reconstructing Baseline Data for Project Populations and Comparison Groups  
Collecting Data on Sensitive Topics or From Difficult-to-Reach Groups  
Summary  
Further Reading  
 
6. Political Constraints
Values, Ethics, and Politics  
Societal Politics and Evaluation  
Stakeholder Politics  
Professional Politics  
Political Issues in the Design Phase  
Political Issues in the Conduct of an Evaluation  
Political Issues in Evaluation Reporting and Use  
Advocacy  
Summary  
Further Reading  
 
7. Strengthening the Evaluation Design and the Validity of the Conclusions
Validity in Evaluation  
Factors Affecting Adequacy and Validity  
A Framework for Assessing the Validity and Adequacy of QUANT, QUAL, and Mixed-Methods Designs  
Assessing and Addressing Threats to Validity for Quantitative Impact Evaluations  
Assessing Adequacy and Validity for Qualitative Impact Evaluations  
Assessing Validity for Mixed-Method (MM) Evaluations  
Using the Threats-to-Validity Worksheets  
Summary  
Further Reading  
 
8. Making It Useful: Helping Clients and Other Stakeholders Utilize the Evaluation
What Do We Mean by Influential Evaluations and Useful Evaluations?  
The Underutilization of Evaluation Studies  
Strategies for Promoting the Utilization of Evaluation Findings and Recommendations  
Summary  
Further Reading  
 
PART II: A REVIEW OF EVALUATION METHODS AND APPROACHES AND THEIR APPLICATION IN REALWORLD EVALUATION: FOR THOSE WHO WOULD LIKE TO DIG DEEPER
 
9. Standards and Ethics
Standards of Competence  
Professional Standards  
Ethical Codes of Conduct  
Issues  
Summary  
Further Reading  
 
10. Theory-Based Evaluation and Theory of Change
Theory-Based Evaluation (TBE) and Theory of Change (TOC)  
Applications of Program Theory Evaluation  
Using TOC in Program Evaluation  
Designing a Theory of Change Evaluation Framework  
Integrating a Theory of Change Into the Program Management, Monitoring, and Evaluation Cycle  
Program Theory Evaluation and Causality  
Summary  
Further Reading  
 
11. Evaluation Designs
Different Approaches to the Classification of Evaluation Designs  
Assessing Causality: Attribution and Contribution  
The RWE Approach to the Selection of the Appropriate Impact Evaluation Design  
Tools and Techniques for Strengthening the Basic Evaluation Designs  
Selecting the Best Design for RealWorld Evaluation Scenarios  
Summary  
Further Reading  
 
12. Quantitative Evaluation Methods
Quantitative Evaluation Methodologies  
Experimental and Quasi-Experimental Designs  
Strengths and Weaknesses of Quantitative Evaluation Methodologies  
Applications of Quantitative Methodologies in Program Evaluation  
Quantitative Methods for Data Collection  
The Management of Data Collection for Quantitative Studies  
Data Analysis  
Summary  
Further Reading  
 
13. Qualitative Methods
Design  
Data Collection  
Data Analysis  
Reporting  
Real-World Constraints  
Summary  
Further Reading  
 
14. Mixed-Method Evaluation
The Mixed-Method Approach  
Rationale for Mixed-Method Approaches  
Approaches to the Use of Mixed Methods  
Mixed-Method Strategies  
Implementing a Mixed-Method Design  
Using Mixed Methods to Tell a More Compelling Story of What a Program Has Achieved  
Case Studies Illustrating the Use of Mixed Methods  
Summary  
Further Reading  
 
15. Sampling and Sample Size Estimation for RealWorld Evaluation
The Importance of Sampling for RealWorld Evaluation  
Purposive Sampling  
Probability (Random) Sampling  
Using Power Analysis and Effect Size for Estimating the Appropriate Sample Size for an Impact Evaluation  
The Contribution of Meta-Analysis  
Sampling Issues for Mixed-Method Evaluations  
Sampling Issues for RealWorld Evaluation  
Summary  
Further Reading  
 
16. Evaluating Complex Projects, Programs, and Policies
The Move Toward Complex, Country-Level Development Programming  
Defining Complexity in Development Programs and Evaluations  
A Framework for the Evaluation of Complex Development Programs  
Summary  
Further Reading  
 
17. Gender Evaluation: Integrating Gender Analysis Into Evaluations
Why a Gender Focus Is Critical  
Gender Issues in Evaluations  
Designing a Gender Evaluation  
Gender Evaluations With Different Scopes  
The Tools of Gender Evaluation  
Summary  
Further Reading  
 
18. Evaluation in the Age of Big Data
Introducing Big Data and Data Science  
The Increasing Application of Big Data in the Development Context  
The Stages of the Data Analytics Cycle  
Potential Applications of Data Science in Development Evaluation  
Building Bridges Between Data Science and Evaluation  
Summary  
Further Reading  
 
19. Managing Evaluations
Organizational and Political Issues Affecting the Design, Implementation, and Use of Evaluations  
Planning and Managing the Evaluation  
Institutionalizing Impact Evaluation Systems at the Country and Sector Levels  
Summary  
Further Reading  
 
20. Conclusions and Challenges on the Road Ahead
The Challenge of Assessing Impacts in a World in Which Many Evaluations Have a Short-Term Focus  
The Continuing Debate on the “Best” Evaluation Methodologies  
Selecting the Appropriate Evaluation Design  
Mixed Methods: The Approach of Choice for Most RealWorld Evaluations  
How Does RealWorld Evaluation Fit Into the Picture?  
Quality Assurance  
The Need for a Strong Focus on Gender Equality and Social Equity  
Basing the Evaluation Design on a Program Theory Model  
The Importance of Context  
The Importance of Process  
Dealing With Complexity in Development Evaluation  
Emergence  
Integrating the New Information Technologies Into Evaluation  
Greater Attention Must Be Given to the Management of Evaluations  
The Importance of Competent Professional and Ethical Practice  
Developing Standardized Methodologies for the Evaluation of Complex Programs  
Creative Approaches for the Definition and Use of Counterfactuals  
Strengthening Quality Assurance and Threats-to-Validity Analysis  
Defining Minimum Acceptable Quality Standards for Conducting Evaluations Under Constraints  
Further Refinements to Program Theory  
Further Refinements to Mixed-Method Designs  
Integrating Big Data and Data Science Into Program Evaluation  
Strengthening the Integration of a Gender-Responsive Approach Into Evaluation Programs  
 
References


Paperback
ISBN: 9781544318783
£68.00