Turning the Tables: Assessing Our Own Assessment Tool

Date: November 18, 2014

Amber Bradley

Director, Assessment Tools, CEP

Gathering, analyzing, and acting on feedback are key priorities here at CEP. These commitments have driven us to develop our assessment tools, publish research reports that draw on the voices of key foundation stakeholders, and devote our energy to helping funders collect and understand comparative feedback so they can reflect on and improve their effectiveness.

We need feedback, too, to improve what we do. To that end, for the eighth time since 2005, we have contracted a third-party evaluator, LFA Group: Learning for Action, to survey recent Grantee Perception Report (GPR) subscribers about their experiences and overall satisfaction with the tool. The full report from LFA is available on our website.

Since 2012 we’ve made several major changes to the GPR, including:

  • Redesigning the presentation of our data with new, more intuitive charts
  • Developing an online reporting system that allows users to
    • interact with their data,
    • see cuts of their results by different subgroups of their choosing, and
    • compare their results against nine different cohorts of funders (including their own custom selection) from our ever-growing dataset of more than 250 participating foundations.
  • Updating the report to include more direct recommendations based on a GPR user’s specific findings, our broader research, and our 12 years of experience working with funders to gather and act on grantee feedback.

Two years down the road, we were eager to see what 2014’s report yielded.

We were therefore pleased to learn that this year’s GPR user survey showed upward trends from the dips of 2011-2012. LFA surveyed 24 funders that commissioned a GPR between May 2013 and February 2014 about their satisfaction with the GPR tool and process, the value they saw in the GPR, and the changes it inspired them to make. Here’s what we learned from the 16 funders who responded:

When it came to overall satisfaction with the GPR experience, 2014 respondents reported a very high level of satisfaction, averaging 6.1 on a scale of 1-7 (where 7 = “very satisfied”) and improving on 2011-2012 ratings. We also saw notable improvement in users’ perceptions of the GPR’s value relative to its cost. For first-time GPR subscribers in particular, perceptions on this metric improved by a statistically significant margin, with a 2014 average rating of 6.3 (where 7 = “excellent” value for the cost).

As for changes inspired by GPR results, we were pleased to see that all 2014 respondents reported making a change in at least one area of grantee engagement after commissioning a GPR. Every respondent reported making a change in their communication with grantees, while 77% of respondents reported making at least some change in their provision of assistance to grantees “beyond the check.” Because grantee feedback is only useful if it’s used to learn and improve, we’re especially glad that users have historically taken, and continue to take, action based on the results of their GPRs.

We were eager to hear initial feedback about the utility of the new online reporting system, one of the largest changes we’ve ever made to our assessment tools. Overall, results showed that the services and features of this new online format were well received. Ratings for the clarity of data charts and graphs trended upward compared to 2011-2012, and the interactive online report received strong feedback, with its helpfulness rated a 5.9 on average.

As always, we are committed to continually improving our work. Among 2014 respondents’ suggestions for improvement was the request that our reports and presentations offer still more specific and concrete recommendations. We took this as a positive sign that we’re moving in the right direction by providing recommendations beyond just neutral synthesis of the data. And we’ll continue to push further.

Based on users’ feedback, we’re also working to:

  • Continually improve the online system, including the look and feel, navigability, print functionality, and applicability to funders of all scopes and sizes.
  • Ensure that our assessment tools are as actionable as possible, for example by connecting funders to relevant research and case studies when applicable.

I’d like to thank recent GPR users for taking the time to provide us with their thorough and candid comments. We’re extremely grateful for our audience’s diligence and care in helping us hone our work. As we have done in the past, we plan to learn from these perspectives and continue to improve. It’s a process that my colleagues and I are very much looking forward to.

Amber Bradley is Director of Assessment Tools at the Center for Effective Philanthropy.

Editor’s Note: CEP publishes a range of perspectives. The views expressed here are those of the authors, not necessarily those of CEP.
