Equitable Evaluation in Practice: Towards More Inclusive, Just, and People-Centered Practices

Ana Jackson, Ph.D., Jessica Mindnich, Ph.D., and Sonia Moldovan

As practitioners of the Equitable Evaluation Framework, we are committed to putting the people we are here to serve at the center of our work. We view evaluation as a tool for advancing justice and liberation — and we also acknowledge that this work is hard, messy, and imperfect. Here are some ways we’ve fostered more equitable approaches to evaluation.

A Framework for Equitable Learning and Evaluation

Across our organizations, the problems we tackle stem from deep-rooted, systemic causes: gender and economic biases and racism. Our strategy is to effect systems change by addressing these root causes.

The pathways between actions and outcomes for these pervasive and highly complex issues are long, convoluted, and involve countless systems, organizations, and actors. The ultimate indicators of success — including equalizing health outcomes across all population groups — will emerge over many years.

Evaluation and learning help provide information and data that are essential to strategic thinking. Evaluation also produces learning, stories, information, and understanding that support grantee empowerment and sustainability as well as foundation and grantee decision-making, and it can help build knowledge and evidence for the field. Evaluation and learning address critical questions, including:

  • To what extent and in what ways are we living into our values and fulfilling our missions?
  • How can we enhance our influence and impact?
  • How can we center grantee voice and community voice in learning that is meaningful and useful to partners?

In our evaluation and learning practice, we also want to ensure that we’re not inadvertently contributing to the root causes we want to change. Evaluation and learning must be in service of and contribute to equity, yielding more meaningful and relevant insights as well as greater equity in our practices, our partnerships with communities, and the questions we ask. Evaluation can play a powerful role in ensuring that individuals and communities both share equitably in the knowledge, wealth, and resources of society and contribute to their creation.

Effectively Integrating Community Voices

With the above approach in mind, learning with and from communities means that we include them at all stages of learning — from design to implementation, meaning making, and dissemination.

Learning in Design

In launching a new program aimed at homelessness prevention, a developmental evaluation began with community listening sessions. Insight from people who were likely to experience unstable housing was critical in informing the program and evaluation design. Through these early focus groups, we gleaned insights into the prevalence of, factors contributing to, and potential solutions for unstable housing. To ensure that low-income residents could attend, sessions were held in trusted community centers, and participants were provided childcare and translation, compensated for their time, and reimbursed for transportation.

Learning in Implementation

A 20-year retrospective evaluation of a scholarship program was initiated with an advisory committee that included both program staff and scholars. Advisory committee members are compensated for their time and expected to participate through the entirety of the evaluation — from design to meaning-making. Additionally, as part of a multi-method approach, each interviewed scholar subsequently interviewed one person who was inspired by their college journey.

Learning in Meaning-Making

A developmental evaluation supporting real-time learning was launched in conjunction with a racial equity community of practice. The design and approach of the evaluation centered the voices and experiences of participants and respected participants as the primary beneficiaries of community-generated learnings. Throughout the 18-month cohort, participants completed interviews, focus groups, surveys, and journals while also participating in community meaning-making sessions. During these sessions, participants and facilitators reflected upon and generated meaning from the group’s data while also applying learnings that led to real-time adjustments to the program design.

Sharing What We Learn

It is important that those who have participated in generating knowledge are considered the first audience when disseminating learnings. While this sounds easy enough, in practice it takes time and intention. Nonetheless, we have worked toward systematizing a process in which learnings are shared first with foundation leads, program beneficiaries, and grantees; then with the broader foundation and key stakeholders; and finally with the public. Sharing learning in concentric circles, starting with those at the center of our learning and moving outward, is a way to ensure engagement, transparency, and respect for the time and experiences of those involved.

Approaches to Funding Equity in Learning

The Conrad N. Hilton Foundation — where one of us (Sonia) leads strategy, learning, and evaluation — prioritizes investments in research and evaluation to foster transparency, learning, and continuous improvement, and to stay true to its values of humility and stewardship. That commitment includes seeking to understand the Foundation’s contribution toward equitable outcomes, providing opportunities for key stakeholders to be heard, especially those affected by investments, and sharing insights with the wider community.

Prioritizing evaluation and research required a shift in the Foundation’s grantmaking practice. In 2020, the organization revised its strategies and embedded research and evaluation as a key component of the new framework. Leadership also set a target of investing 10-15 percent of the Foundation’s approximately $300 million annual budget in research and evaluation and obtained approval from the Board. To guide these investments, learning and evaluation officers, in coordination with the program team, conducted evidence reviews, developed learning agendas, and are piloting different approaches to make space for listening and learning. These approaches include:

  1. Planning grants: In addition to long-term funding, the Hilton Foundation invests in planning grants of 6-12 months. During this period, grantees co-design programs with key stakeholders and identify outcomes that are most important to them.
  2. Technical support: The Foundation works with three evaluation partners who provide technical support to grantees in East Africa, West Africa, and the U.S. The support is tailored to the specific research and evaluation needs of the organization and not linked to funding.
  3. Advisory groups: Strategy-level evaluation work is supported by advisory groups consisting of program participants who are highly knowledgeable about the issues we are addressing. Westat, an evaluation partner for the Homelessness Initiative, regularly consults with an expert who is familiar with the complexities of receiving services from the system that supports people who are unhoused. Child Trends, an evaluation partner for the Foundation’s Foster Youth Initiative, has hired two associates who have experienced foster care and is supporting them to become future evaluators.

Avoid the False Dichotomy Between Community and Foundation

Community members may believe that they know the issues and priorities best and that there is no need for the data evaluation provides. Partnering with communities to learn requires avoiding a false dichotomy between community knowledge and foundation data. It’s vital to learn from partners with different perspectives in ways that leave them feeling heard. While communities know their own contexts best, foundation partners and researchers can also share what they’ve learned in other communities and contexts.

Evaluation and learning are critical components of the grantmaking process for grantees, participants, and funders. By working together, we can ensure that our investments are making a measurable impact and achieving their intended outcomes. Evaluation provides an opportunity to be transparent, strengthen our learning practices, empower our community partners, and work toward equity in all that we do.

Ana Jackson, Ph.D. is chief evaluation and data strategy director at Blue Shield of California Foundation. Jessica Mindnich, Ph.D. is the senior director of strategic learning and evaluation at the Mellon Foundation. Sonia Moldovan is director of strategy, learning, and evaluation at the Conrad N. Hilton Foundation.
