Fueling Change Through Feedback

Date: December 6, 2012

Kevin Bolduc

Vice President, Assessment and Advisory Services, CEP

All the survey feedback in the world doesn’t matter even a tiny bit unless it’s acted on. So the test of the Grantee Perception Report (GPR) isn’t whether a foundation receives interesting grantee feedback, but whether it is able to create productive change based on the feedback it receives.

We knew foundations told us this feedback mattered to them, but it took a few years before we had the data to show that, over time, grantees’ experiences were improving at many foundations using the GPR. Even if we can’t claim causality (that the GPR was the primary factor in those changes), the evidence of its role is strong enough that I’m convinced this tool matters for funders and for grantees.

In my last post, I talked about the first hurdle: getting funders to embrace comparative data. The second challenge was working with foundations to ensure they had the resources and guidance to drive change. We’ve always spent a lot of time adjusting the survey and the report on the advice of dedicated early GPR users. We added and deleted survey questions, conducted research, and created case studies based on the challenges they told us about. As you’ll see later in this post, that continues today. Just like GPR users, we try hard to act on the feedback we receive to ensure the GPR maintains its utility for the sector.

Funders Acting on GPR Feedback

There were good early signs that funders were taking the feedback very seriously. One of the first came in a surprising way: a public declaration in 2004 from The William and Flora Hewlett Foundation about the strengths and weaknesses they saw in their GPR results, a link to those results on their website, and a commitment from then-President Paul Brest to grantees to work on improving aspects of the Foundation’s work. He stated:

“We believe that success in the field of philanthropy requires transparency and a real desire to learn about what works and what needs strengthening. We’re taking an honest, serious look at how we can best achieve our goal of being an effective grantmaking organization, and we’re using our grantees’ input to better understand how to do this.”

A foundation talking publicly about feedback from its grantees was so surprising that The New York Times reported on it in an article called “Charities Surprise Donor Foundations with Bluntness.” (The headline, of course, didn’t do justice to the fact that many of these foundations also learned that they had real strengths in areas important to them.)

Now several dozen foundations, including eight of the ten largest funders in the country, have publicly posted excerpts of their GPRs and commitments to improvement on their websites. Some, like The Rhode Island Foundation, the David and Lucile Packard Foundation, and The William and Flora Hewlett Foundation, post new GPR results every two or three years, building a growing public record of feedback and change.

Foundations’ willingness to share their results was an encouraging sign that the GPR was providing useful feedback. And as I’ve written previously, many funders that don’t make their GPR results transparent still commit to improving their work.

In an effort to understand what changes might be happening at foundations committed to driving change with GPR feedback, we began commissioning LFA Group to study GPR users’ experiences and improvement efforts. (Yes, we practice what we preach about the need for unbiased, candid feedback from stakeholders.)

Over the years, we’ve seen that the vast majority of GPR users (95 percent) report making “some” or “significant” changes in response to their GPRs, most significantly in their communications with grantees, their attitudes toward work with grantees, and their grantmaking processes. This evidence was important, but it was self-reported. We, and the foundations using the GPR, were looking for more. (You can see all of our third-party LFA assessments on our website here.)

By 2010, enough funders (59) had repeated the GPR that our research team could analyze whether their results were changing. And they were! On a number of important measures in the survey, most significantly grantees’ ratings of funders’ impact on their organizations, we saw statistically significant and meaningful changes in grantees’ responses between the first and second time a funder used the GPR.

Now we knew that thousands of grantees’ experiences were changing for the better with many of the funders that had committed to using the GPR more than once to drive improvement. We profiled these findings in a report entitled “Can Feedback Fuel Change at Foundations?” It was a huge validation of the power of the comparative data in the GPR coupled with dedicated funders’ commitments to improvement. These results drive me every day to encourage more funders to take advantage of the GPR.

We’re Acting on User Feedback about the GPR

Earlier, I mentioned how CEP uses LFA surveys to gauge GPR users’ satisfaction and their experience acting on GPR results. These assessments have also been a major channel for user advice, driving improvement in the GPR and in our ability to help foundations interpret and act on the data. Since we first began these assessments in 2005, each year we’ve seen evidence that the GPR is highly valued, that users would recommend the tool to others, and that it influences participating funders to make substantive changes in their work.

In 2011, though, we saw some of the LFA survey ratings drop. And we saw some evidence that subscribers wanted more from the GPR. The same trends held true in 2012. The results, available on our website here, show that:

  • GPR subscribers still report high levels of satisfaction with the GPR experience, but these ratings continued the downward trend that began in 2011. Some point out that CEP staff could do a better job presenting GPR results. They want our staff to do even more to understand each foundation’s context and how the GPR data fits into its history, strategy, and operations.
  • Even as subscribers continue to rate the GPR as highly valuable relative to its cost, these ratings have decreased since 2009, a period during which we raised the price of the GPR to reach economic break-even.
  • The level of reported change has decreased, particularly among repeat GPR users (some of whom have used the tool three, four, or five times). Eighty-three percent of GPR users reported making changes based on their GPR results, with nearly half reporting “significant change,” but these levels are lower than in previous years. Users ask for better synthesis and clearer recommendations.

I’ve always had compassion for the funders we work with as they seek to create change in their work and organizations; it’s not easy to receive feedback that isn’t as positive as you’d hoped. Now, I think, I can empathize with funders even a bit more deeply. We were disappointed with our results, but they also focused us on the opportunity to do more.

We’ve been modifying the GPR constantly over the years, but ten years after creating the first report, these results suggest it was time to update the tool, and our work, more significantly in response to users’ evolving needs, especially those of regular GPR users looking for more opportunities to customize their analyses and the survey itself. While we’ve begun making some changes, we’ll be looking to users and potential users to continue to guide us, as we always have.

We’re determined to keep learning, not just from the feedback itself, but from the experience of making changes based on candid feedback. I am confident that this experience will strengthen our ability to provide insight and guidance to the funders we work with. After all, we’re going through the same thing some of them have gone through, and we’re taking inspiration from funders like Endowment for Health, the David and Lucile Packard Foundation, the Robert Wood Johnson Foundation, and The Wallace Foundation, each of whose improvement processes we’ve profiled in CEP’s research. I’d like to thank everyone who took the time to complete the survey and provide us with candid feedback. We’re grateful for this clear input on how we can improve the GPR.

Change is always hard: it’s a cliché, but it’s true. With good feedback from our stakeholders and an embrace of innovation from staff across CEP, I know we’ll make progress. In my next post, I’ll talk about some of the changes we’re making, including a significant revision of the GPR’s structure and data presentation.

Kevin Bolduc is Vice President of Assessment Tools at the Center for Effective Philanthropy. You can find him on Twitter @kmbolduc. Thanks to CEP Manager Amber Bradley—who leads our GPR process—for contributing to this blog post.

Editor’s Note: CEP publishes a range of perspectives. The views expressed here are those of the authors, not necessarily those of CEP.
