Innovative Methods for Designing Actionable Program Evaluation: Behind the Article with Authors Brandon Nesbit and Sally Thigpen
Hi there! We are Brandon Nesbit and Sally Thigpen, two of the authors of “Innovative Methods for Designing Actionable Program Evaluation.” It’s one of the papers in Catalyzing State Public Health Agency Actions to Prevent Injury and Violence, a supplement of the Journal of Public Health Management and Practice.
Funded programs are expected to demonstrate implementation progress and outcome impact, whether that funding originates from governmental agencies or non-governmental organizations. Program evaluation allows programs to describe their impact and identify areas for improvement. For funders, evaluation provides another critical advantage: it describes both the independent and the collective impact of individual programs. Aggregation, however, can be challenging when the strategies implemented or data collected vary across partners with differing needs. This was the case with CDC’s Core Violence and Injury Prevention Program (Core VIPP).
What was happening:
State health department capacity and infrastructure for injury prevention have remained limited even though injury and violence have been recognized as public health issues for several decades. For instance, monitoring health status to identify and solve community health problems is the first of the 10 Essential Public Health Services, but many state programs have historically lacked access to an epidemiologist, statistician, or other data professional with expertise for this work.
CDC funded Core VIPP to address this gap. Core VIPP was designed to assist states in building and/or maintaining their delivery systems for disseminating, implementing, and evaluating best practice injury and violence prevention strategies. The Core VIPP evaluation was designed to capture the outcomes associated with strengthening public health infrastructure and implementing evidence-based strategies across many state programs. This evaluation needed to reflect a wide range of evidence quality and community contexts, such as partnerships and policy realities. We developed an actionable evaluation process that provides rapid-cycle feedback to support successful, nimble implementation. It also relies on partnerships to ensure timely dissemination and updating of data so that program trajectories can be adjusted. Our evaluation team used this design to (1) capture changes in recipient capacity to implement injury prevention work; (2) describe the association between recipient activities and injury-related outcomes; and (3) implement continuous quality improvement processes.
What we learned:
Our actionable evaluation process balanced the need for activities and indicators that reflected the local context of Core VIPP awardees with the ability to describe the program’s collective impact. The reality is that there is a natural tension between aggregating programmatic data and being responsive to differing contexts. We learned a lot about the successes and challenges of making evaluation actionable at the state and federal levels.
Lessons learned:
- Pairing surveys that measure organizational capacity with information collected through interviews or site visits with key staff is vital to obtaining a comprehensive understanding.
- If you intend to aggregate quantitative data across funded entities, developing predefined indicators and requiring all awardees to report on them ensures consistency.
- Communicating evaluation results as timely, easy-to-understand feedback is necessary for stakeholder uptake and use. Moving beyond static reports and presentations to innovative approaches such as interactive dashboards is increasingly important, and data comes to life when quantitative results are paired with narrative success stories.
- Strong, active leadership is vital to successful peer-to-peer communities of practice. They will not thrive on membership input alone.
What action can be taken:
We used these lessons learned to inform the subsequent funding cycle, the Core State Violence and Injury Prevention Program (Core SVIPP), 2016-2021. This included requirements for evidence-based implementation and a state support team structure for delivering training and technical assistance.
To make evaluation actionable, we suggest including partners in its development and implementation. This allows for more informed mid-stream course corrections. We encourage bidirectional rapid feedback to inform policy, funding, and programmatic decisions. Lastly, we suggest that annual reporting combine a standardized qualitative report with quantitative data. You can read more about our Core VIPP actionable evaluation in the full article here!
http://journals.lww.com/jphmp/Fulltext/2018/01001/Innovative_Methods_for_Designing_Actionable.4.aspx
Nesbit, Brandon MPH; Hertz, Marci MPH; Thigpen, Sally MPA; Castellanos, Ted MPH; Brown, Michelle; Porter, Jamila DrPH; Williams, Amber. Innovative Methods for Designing Actionable Program Evaluation. Journal of Public Health Management and Practice: January/February 2018, Volume 24, Supplement, pp S12–S22. doi: 10.1097/PHH.0000000000000682.
The findings and conclusions in this blog are those of the authors and do not necessarily represent the official position of the Centers for Disease Control and Prevention.
Read the full issue of our special supplement Catalyzing State Public Health Agency Actions to Prevent Injury and Violence.
- Introduction to the Special Issue: Catalyzing State Public Health Agency Actions to Prevent Injuries and Violence
- Mind the Gap: Approaches to Addressing the Research-to-Practice, Practice-to-Research Chasm
Brandon Nesbit has been at CDC for 6 years as a Health Scientist in the National Center for Injury Prevention and Control. He is the lead evaluator for two state-funded cooperative agreements and leads data collection and management efforts for a third program in the Center. Brandon’s interests include the development of evaluation metrics and indicators, data collection processes, dissemination of surveillance and evaluation findings through data visualization, and the use of data for decision making. Before coming to CDC, Brandon evaluated the Hospital Preparedness Program for the state of Georgia. Brandon has an MPH from the University of Georgia.

Sally Thigpen is a Health Scientist with the Division of Analysis, Research, and Practice Integration (DARPI). She provides support across the Injury Center for program evaluation and research, with specific expertise in actionable knowledge to promote behavioral and social change. Sally received her Bachelor of Arts in Sociology and Anthropology from Agnes Scott College and a Master of Public Administration that incorporated behavioral theory and research from the Andrew Young School of Public Policy at Georgia State University. She serves on Agnes Scott College’s Public Health Advisory Committee.