For funders planning to evaluate or review their strategies, a wide range of resources on program evaluation and assessment can provide guidance on methodology, data use, and analysis. And yet, the available literature on how to structure foundation-wide strategic review processes is strikingly sparse. Without existing structures in place, kicking off a strategy review can be a real challenge.
The way that funders structure these strategy review processes — how they examine the effectiveness of their strategies, revisit the fundamental assumptions and approaches of their work, and make funding decisions — affects their direction, impact, external relationships, and even internal culture. Given this, the lack of information on structuring a strategy review poses challenges for funders trying to plan for this process as intentionally and effectively as possible.
This was the challenge the William Penn Foundation encountered when trying to proactively design their next strategic review process. Hungry to make evidence-based decisions and seeking to learn from their peers, the Foundation commissioned CEP to fill this gap in research and best practices. And so last year, CEP interviewed 20 foundation presidents, data and evaluation directors, program directors, and other staff from 13 funders similar to the William Penn Foundation in asset and giving size, programmatic areas, regional focus, and/or family board governance.
We’ve written up our findings in a new publication, just released today.
CEP’s work studying grantee relationships and staff experiences — along with our expertise in confidential surveys and interviews — set our team up to hit the ground running on this project. Through phone interviews and a short online survey, we asked participants about a range of topics related to the planning, structure, implementation, and outcomes of their most recent strategy reviews:
- How did your foundation determine the timeframe for a comprehensive review? Do you conduct a review every five years, based on changes in your fields or institution, or does it vary by each program’s goals?
- When and how do you communicate with grantees during a strategy review — before starting or after board decisions or other milestones? Through blog posts and other external communications and/or through individual conversations?
- Are consultants worth the investment, and if so, what role should they play?
- How do considerations of systemic inequities play a role in information gathering, evaluation, and decision-making?
- What were the unanticipated challenges you encountered, and what were the key lessons learned?
Though this group of 13 funders used varied approaches, common elements shone through in what we heard from them. For example, nearly all interviewees described the value of seeking candid, comprehensive feedback on their work from a diverse group of stakeholders — including grantees, local leaders, other funders, and field experts.
Of course, specific evaluation methodologies varied by the focus of each program under review. Still, project participants described a range of ways through which they seek input on their work. These include:
- Data-driven analysis. Funders drew insights from grant evaluations, landscape analyses, literature reviews, demographic projections, and data about the ultimate beneficiaries of the funded work.
- Discussions with diverse stakeholders, including grantees and other nonprofit organizations. To gain the perspectives of those outside of their usual networks, one foundation leader described asking grantees, “Who else should we be talking to?”
- Personal narratives. “Bringing the community voices to the table…really matters,” noted one interviewee. “Their voices are incredibly authentic, and for our stakeholders, as much as they love data and data-driven decision-making, it’s the narrative that everyone remembers.”
- Closing the loop. After gathering external input through interviews, informal conversations, and community meetings, several funders noted the importance of sharing what they’ve learned with the individuals and organizations who lent their time and expertise.
Looking internally, nearly all participating funders — unprompted — raised staff capacity as a key challenge in their most recent strategy review, describing the work required as “daunting” and “easy to underestimate.” One participant said the demands on programmatic staff and foundation leadership felt like a second job. Though there is no easy answer to this challenge, we heard a range of actionable recommendations from interviewees, including determining the parts of the process in which consultants can be helpful (and which aspects should be staff-led), when to stop collecting data, and how to set expectations early in the timeline to avoid rework.
A full description of the components, challenges, and considerations that these interviews revealed — as well as our methodologies, findings, and recommendations — can be found in the report, which is available for free download here.
This report was completed with the support of Hilary Rhodes at the William Penn Foundation, who helped shape the project’s focus and design. The opinions expressed in this report are those of the authors and do not necessarily reflect the views of the William Penn Foundation.
The project would not have been possible without the insights, candid reflections, and suggestions shared by interviewees from the 13 participating foundations: Barr Foundation, Charles and Lynn Schusterman Family Philanthropies, Ewing Marion Kauffman Foundation, Hillman Family Foundation, Houston Endowment, John D. and Catherine T. MacArthur Foundation, Richard King Mellon Foundation, Surdna Foundation, The David and Lucile Packard Foundation, The Heinz Endowments, The McKnight Foundation, The William and Flora Hewlett Foundation, and Walton Family Foundation. We are grateful for their time and generosity.
CEP lends its philanthropic knowledge, professional network, and survey and interview expertise to funders on customized projects that address their specific needs, questions, and contexts. More information can be found on CEP’s Advisory Services page or by reaching out to CEP Director, Assessment and Advisory Services, Mena Boyadzhiev at menab@cep.org.
Mena Boyadzhiev is Director, Assessment and Advisory Services, at CEP. Alina Tomeh is Senior Analyst, Assessment and Advisory Services, at CEP.