
Evaluation: The Journey and the Destination

Date: November 15, 2018

Jehan Velji

Senior Portfolio Director, Edna McConnell Clark Foundation

Teresa Power

Portfolio Manager and Director of Portfolio Operations, Edna McConnell Clark Foundation

A new CEP report titled Understanding & Sharing What Works highlights the challenges foundations face in understanding what does and doesn’t work programmatically, deciding what to do with that information, and sharing what they have learned. At the Edna McConnell Clark Foundation (EMCF), we’ve encountered some of the same challenges on our journey with evidence building and evaluation. In hopes that it will be helpful to others in the field, we would like to share some of what we’ve learned, how we’ve evolved our thinking, and where we are heading.

In 2011, EMCF partnered with the federal Social Innovation Fund, which was created to identify and scale effective programs, to select 12 grantees with strong outcomes and support each of them in undertaking a rigorous evaluation. Most of the grantees, which we carefully selected after in-depth due diligence, aimed to complete a randomized controlled trial (RCT), generally considered the most rigorous type of third-party evaluation.

Our goal in investing in these evaluations was to build the evidence base of what works for children and youth, and then apply this knowledge to help scale effective programs. We engaged experts, including our own evaluation advisory committee, to help us assess grantees’ readiness for evaluation, explore different evaluation designs, and develop evaluation plans. Although we knew the findings wouldn’t be as simple as “thumbs-up or thumbs-down” ratings of these programs’ effectiveness, we were optimistic about the potential for these programs to demonstrate consistently positive outcomes — and to strengthen their evidence bases.

Despite all the upfront analysis we undertook — including supporting implementation studies and testing the feasibility of achieving the sample size — our experience taught us that even the most carefully considered evidence-building process can be much more challenging than originally anticipated. After a few years and several million dollars invested, our grantees’ evaluations yielded, with a couple of exceptions, mixed results. Instead of a deeper, relatively straightforward assessment of whether or not a program worked, we had head-scratchers. For example, some evaluations showed statistically significant positive results in one area, but surprisingly not in others.

We wondered whether some of the grantees had been as ready for an RCT as we initially thought, whether the timeframe we had imposed allowed enough time to reach the impacts that we and the grantees were most interested in, and whether the designs of some evaluations addressed the questions we all wanted answered. Our grantees were committed to impact and were brave enough to undergo rigorous evaluations, so it was important to all of us ― the grantee, the evaluator, and the funder ― to figure out how to interpret and act on the results. EMCF is a performance-based investor, but given all the open questions these evaluations raised, it was difficult to determine how best to use the information from the RCTs to inform our investment decisions.

As we dug deeper to understand why these evaluations produced surprising results, we rethought how our learnings could help us most productively engage in evaluation-related activities with grantees. Specifically, we realized the need to consider more carefully where grantees were in the course of their evidence-building journeys. We recognized that some grantees would need to work on enrolling larger sample sizes, while others would need more time to show meaningful results. For some grantees, a less rigorous design might be more appropriate given the stage of their programs’ development, while for others an implementation study might be the best place to start as they codified their interventions.

On further reflection, we came to understand that the most important goal of evaluation is not to determine whether a program works or doesn’t work, but to discover how to make a program work better over time. We now see much more explicitly that rigorous evaluation is one important step along a much broader journey of learning and continuous improvement.

Our grantees, many of which already embraced the spirit of continuous learning and improvement, rose to the challenge and used the evaluation findings to inform learning agendas, including hypotheses for program improvements to test in the future. At the same time, we at the foundation have continued to look for the most rigorous approaches to evidence building that are suitable to a particular intervention. We have also begun to explore the potential of less costly and more rapid approaches to evidence building that could help grantees and their funders learn more quickly whether program modifications are effective.

We remain committed to investing in RCTs when appropriate and to maintaining the highest level of rigor possible when assessing evidence. We also know that we and other funders need to innovate to find additional evidence-building approaches, while at the same time being more flexible in how we support grantees to build evidence and learn more about their programs.

As EMCF pursues its limited-life strategy, we are eager to share more of our learnings in ways that will be most useful to other funders and practitioners in support of better outcomes for youth. As performance-based investors, we see the value of a data-driven culture, a commitment to learning, and a willingness to test program modifications in order to improve. Evaluation is not only about assessing the effectiveness of program interventions; it’s also a journey of ongoing learning and improvement to ensure we are doing our best to deliver impact for those we seek to help.

Jehan Velji is senior portfolio director at the Edna McConnell Clark Foundation (EMCF). Teresa Power is portfolio manager and director of portfolio operations at EMCF. Follow EMCF on Twitter at @emclarkfdn.

Editor’s Note: CEP publishes a range of perspectives. The views expressed here are those of the authors, not necessarily those of CEP.
