Evaluation for Health Policy and Health Care
A Contemporary Data-Driven Approach

Edited by: Steven Sheingold and Anupa Bir

November 2019 | 336 pages | SAGE Publications, Inc
This is the contemporary, applied text on evaluation that your students need.

Evaluation for Health Policy and Health Care: A Contemporary Data-Driven Approach explores the best practices and applications for producing, synthesizing, visualizing, using, and disseminating health care evaluation research and reports. This graduate-level text will appeal to those interested in cutting-edge health program and health policy evaluation in this era of health care innovation. Editors Steven Sheingold and Anupa Bir’s core text focuses on quantitative, qualitative, and meta-analytic approaches to analysis, providing a guide for both those executing evaluations and those using the data to make policy decisions. It is designed to provide real-world applications within health policy to make learning more accessible and relevant, and to highlight the remaining challenges for using evidence to develop policy.
 
 
List of Figures and Tables
 
Preface
 
Acknowledgments
 
About the Editors
 
PART I. SETTING UP FOR EVALUATION
 
Chapter 1. Introduction
Background: Challenges and Opportunities  
Evaluation and Health Care Delivery System Transformation  
The Global Context for Considering Evaluation Methods and Evidence-Based Decision Making  
Book’s Intent  
 
Chapter 2. Setting the Stage
Typology for Program Evaluation  
Planning an Evaluation: How Are the Changes Expected to Occur?  
Developing Evaluations: Some Preliminary Methodological Thoughts  
Prospectively Planned and Integrated Program Evaluation  
Summary  
 
Chapter 3. Measurement and Data
Guiding Principles  
Measure Types  
Measures of Structure  
Measures of Process  
Measures of Outcomes  
Selecting Appropriate Measures  
Data Sources  
Looking Ahead  
Summary  
 
PART II. EVALUATION METHODS
 
Chapter 4. Causality and Real-World Evaluation
Evaluating Program/Policy Effectiveness: The Basics of Inferring Causality  
Defining Causality  
Assignment Mechanisms  
Three Key Treatment Effects  
Statistical and Real-World Considerations for Estimating Treatment Effects  
Summary  
 
Chapter 5. Randomized Designs
Randomized Controlled Trials  
Stratified Randomization  
Group Randomized Trials  
Randomized Designs for Health Care  
Summary  
 
Chapter 6. Quasi-experimental Methods: Propensity Score Techniques
Dealing With Selection Bias  
Comparison Group Formation and Propensity Scores  
Regression and Regression on the Propensity Score to Estimate Treatment Effects  
Summary  
 
Chapter 7. Quasi-experimental Methods: Regression Modeling and Analysis
Interrupted Time Series Designs  
Comparative Interrupted Time Series  
Difference-in-Difference Designs  
Confounded Designs  
Instrumental Variables to Estimate Treatment Effects  
Regression Discontinuity to Estimate Treatment Effects  
Fuzzy Regression Discontinuity Design  
Additional Considerations: Dealing With Nonindependent Data  
Summary  
 
Chapter 8. Treatment Effect Variations Among the Treatment Group
Context: Factors Internal to the Organization  
Evaluation Approaches and Data Sources to Incorporate Contextual Factors  
Context: External Factors That Affect the Delivery or Potential Effectiveness of the Treatment  
Individual-Level Factors That May Cause Treatment Effect to Vary  
Methods for Examining the Individual-Level Heterogeneity of Treatment Effects  
Multilevel Factors  
Importance of Incorporating Contextual Factors Into an Evaluation  
Summary  
 
Chapter 9. The Impact of Organizational Context on Heterogeneity of Outcomes: Lessons for Implementation Science
Context for the Evaluation: Some Examples From the Center for Medicare and Medicaid Innovation  
Evaluation for Complex Systems Change  
Frameworks for Implementation Research  
Organizational Assessment Tools  
Analyzing Implementation Characteristics  
Summary  
 
PART III. MAKING EVALUATION MORE RELEVANT TO POLICY
 
Chapter 10. Evaluation Model Case Study: The Learning System at the Center for Medicare and Medicaid Innovation
Step 1: Establish Clear Aims  
Step 2: Develop an Explicit Theory of Change  
Step 3: Create the Context Necessary for a Test of the Model  
Step 4: Develop the Change Strategy  
Step 5: Test the Changes  
Step 6: Measure Progress Toward Aim  
Step 7: Plan for Spread  
Summary  
 
Chapter 11. Program Monitoring: Aligning Decision Making With Evaluation
Nature of Decisions  
Cases: Examples of Decisions  
Evidence Thresholds for Decision Making in Rapid-Cycle Evaluation  
Summary  
 
Chapter 12. Alternative Ways of Analyzing Data in Rapid-Cycle Evaluation
Statistical Process Control Methods  
Regression Analysis for Rapid-Cycle Evaluation  
A Bayesian Approach to Program Evaluation  
Summary  
 
Chapter 13. Synthesizing Evaluation Findings
Meta-analysis  
Meta-evaluation Development for Health Care Demonstrations  
Meta-regression Analysis  
Bayesian Meta-analysis  
Putting It Together  
Summary  
 
Chapter 14. Decision Making Using Evaluation Results
Research, Evaluation, and Policymaking  
Program/Policy Decision Making Using Evidence: A Conceptual Model  
Multiple Alternatives for Decisions  
A Research Evidence/Policy Analysis Example: Socioeconomic Status and the Hospital Readmission Reduction Program  
Other Policy Factors Considered  
Advice for Researchers and Evaluators  
 
Chapter 15. Communicating Research and Evaluation Results to Policymakers
Suggested Strategies for Addressing Communication Issues  
Other Considerations for Tailoring and Presenting Results  
Closing Thoughts on Communicating Research Results  
 
Appendix A: The Primer Measure Set
 
Appendix B: Quasi-experimental Methods That Correct for Selection Bias: Further Comments and Mathematical Derivations
Propensity Score Methods  
An Alternative to Propensity Score Methods  
Assessing Unconfoundedness  
Using Propensity Scores to Estimate Treatment Effects  
Unconfounded Design When Assignment Is at the Group Level  
 
Index

Supplements

Instructor Resource Site
study.sagepub.com/sheingold1e

Password-protected Instructor Resources include the following:
  • Editable, chapter-specific Microsoft® PowerPoint® slides offer you complete flexibility in easily creating a multimedia presentation for your course. 
  • Lecture Notes, including an outline and objectives, that can be used for lectures or as student handouts.
  • Case studies from SAGE Research Methods accompanied by critical thinking/discussion questions.  
  • Tables and figures from the printed book are available in an easily downloadable format for use in papers, handouts, and presentations.
 
Student Study Site

Open-access Student Resources include case studies from SAGE Research Methods accompanied by critical thinking/discussion questions.

 “This text offers a general introduction to the process and methods used to conduct rigorous and timely evaluations of health policies and programs using real-world examples. It would make an excellent text for a program evaluation course.”

Brad Wright
University of Iowa

“A must read for anyone interested in monitoring and evaluation! The text does a great job addressing the important ingredients for a successful evaluation.”

Sandra Schrouder
Barry University

 “Evaluating health policies and programs can be a very challenging process because the evaluation itself is so often an afterthought, leading to a variety of data issues that can produce biased results and poor policy decisions. This book provides an outstanding–yet highly accessible–overview of a wide variety of methods that evaluators can use to minimize these biases and generate robust evidence for decision-makers.”

Larry R. Hearld
University of Alabama at Birmingham

For instructors

This book is not available as an inspection copy. For more information, contact your local sales representative.

Purchasing Options

Paperback
ISBN: 9781544333717
£70.00