Assessing the Assessment: The Grantee Perception Report

Feedback really does fuel change. That’s why we created the Grantee Perception Report (GPR) – to allow funders to hear from grantees and continually improve based on their unbiased, comparative feedback. And that’s why it’s important that funders actually can, and do, improve!

It’s also why we try to do the same, periodically using a third party to ask users of the GPR (and our other tools) about their experience working with CEP, the changes the GPR has influenced, and what they think we can do better. Today I want to share some reflections from Learning for Action’s 2015 assessment of our Grantee Perception Report.

The GPR is CEP’s most widely commissioned assessment, now used by forty to fifty funders a year and almost 300 over time. We commission annual customer feedback about this assessment to understand the changes funders are making as a result of using the GPR and to ensure that we are continually improving it.

Some Positives

We were very pleased this year to see that the results of the 2015 survey of GPR users were quite positive overall and, in a number of important areas, trending up from past feedback. Ratings of client satisfaction and of the GPR’s usefulness relative to other processes for measuring overall foundation effectiveness were strong and among those trending up. One hundred percent of respondents said they would recommend the GPR. And, perhaps most importantly, respondents reported making changes in their work in response to GPR findings: in the area of communications with grantees alone, 95 percent of respondents made changes. Greater proportions than ever cited changes they made in providing assistance beyond the grant and in their grantmaking processes, among other areas.

Some Opportunities for Improvement

Last year, my colleague Amber Bradley wrote about a few pieces of feedback we’d received through the 2014 assessment. GPR users suggested that we could strive for more concrete recommendations in addition to neutral diagnosis, and they provided very helpful feedback on the look, feel, and navigability of our still relatively new online reporting system.

We took those suggestions to heart and made some changes in our work.

Based in part on that 2014 feedback, we accelerated a significant upgrade of the online reporting system through which we share results with clients. The intent of that system is to create a valuable management tool that displays overall survey results, allows for the segmenting of results by various subgroups of GPR respondents, and provides comparisons to more than one cohort of funders. The 2015 feedback, happily for us, suggests those upgrades are useful: GPR users rated the utility of the GPR on its own (without CEP interpretation) and the clarity of the data and charts more positively than in the past.

That’s progress toward our ultimate goal, which is to ensure that the GPR isn’t just a moment-in-time assessment but is also an ongoing tool that funders can reference on their own as they encounter new questions over time about how best to structure their work.

In response to past feedback about the utility of our narrative summaries and recommendations, we also instituted changes. On that front, though, we were less satisfied with what we heard in 2015. While we saw some upward trends in the already fairly positive ratings of the extent to which the GPR highlights significant strengths and opportunities, ratings did not rise for the utility of our narrative memos. That tells us to try again. We will, by shortening the narratives and providing crisper recommendations based on specific GPR findings, our experience with what has worked for other funders, and our ever-growing research and knowledge base.

We’ll also be taking a close look at how we use our precious in-person time with funders as we discuss and present results. Ratings of the quality and usefulness of CEP’s presentations of GPR results are high, but we noted a slight downward trend that spurs us to reflect on how we can ensure that we’re always presenting results in ways that strengthen foundation staff’s ability to understand the data fully and draw on CEP’s broad experience working with other funders.

I hope you’ll take a read through these results. And, for those of you who have used the GPR, thank you for your responses. They were very helpful to us, as they always are.

Kevin Bolduc is vice president, assessment and advisory services, at CEP. Follow him on Twitter at @kmbolduc.
