New Efforts to Improve the Grantee Perception Report

At CEP, we are driven by the belief that feedback can fuel positive change. Since joining CEP, I've been proud that we walk this talk in our own work: each year, as part of our internal efforts toward assessment and ongoing improvement, we conduct a survey of funders participating in the Grantee Perception Report (GPR). Last year, we once again commissioned LFA Group, an independent third party, to collect confidential feedback from users of CEP's GPR.

The results, available on our website, show that, as in past years, 2011 respondents indicated a high level of satisfaction with the GPR experience overall. I am particularly pleased to see that a full 100 percent of respondents reported making changes in their work based on the feedback they received from grantees:

  • 97 percent changed their communications with grantees;
  • 81 percent revised their grantmaking processes; and
  • 34 percent changed their foundation strategy.

As the CEP manager in charge of our GPR process, I find it gratifying to see that the GPR continues to have a meaningful impact on foundations' work.

However, the results are not all that I would have hoped to see. We learned that 2011 respondents' ratings dropped from past levels on several aspects of the GPR, including its usefulness relative to other processes, its value relative to its cost, and the clarity of the graphical presentation of GPR data. Although some of these ratings have fluctuated over the years, this year felt different: ratings dropped in numerous areas, and among the 2011 respondents, first-time users of the GPR tended to be substantially less satisfied with the experience than repeat subscribers.

Needless to say, for an organization that strives for excellence, these drops in ratings are sobering. For me personally, the results present the same challenge and opportunity that many of our assessment tool subscribers encounter when receiving feedback through a CEP tool: How can I move past disappointing feedback and digest results in a positive, forward-looking way? How can CEP make improvements that are responsive to the feedback we've been offered by those who understand our work best?

The results, together with comments from GPR subscribers, suggest that the value and usefulness of the GPR could be improved by a more intuitive visual display of data, and through more responsiveness to the particular contexts of individual funders. The feedback gives us a lot to consider, and we’ll continue to use it for ongoing improvement. At present, we’ve identified two steps we intend to take to respond to recent GPR users’ suggestions:

  1. Improve visual representation of data. Constructive feedback about our charts and report formats is not new, and is something we’ve been considering for a long time (see for example Kevin Bolduc’s previous post). This area has, admittedly, been challenging and is one where we’ve been hesitant to make major changes, in part because some repeat users of the GPR very much like the format. But now, based in part on recent GPR users’ feedback, we have made a firm commitment to getting it done: we will change the presentation of our data, with the goal of creating a more intuitive and accessible display of information.
  2. Increase support to GPR users. Though all 2011 respondents reported making changes based on their GPR results, we learned that “lack of time” and “unclear next steps” were the most common challenges they faced. To maintain our integrity as a neutral third party that remains faithful, always, to the data, it is important that CEP not adopt a traditional consulting role, spelling out must-do steps for assessment tool subscribers. However, we believe we can do more to facilitate data-driven change based on individual GPR results, our field-wide research, and the extensive network of GPR subscribers who have learned from the GPR process. A preliminary step we plan to take is to hold follow-up conversations with assessment tool users a few months after the final presentation of their results. Our hope is that this simple step will create an additional opportunity for us to learn from tool users about their ongoing work and the unique challenges they face, and will create further opportunities for CEP to share insights, advice, research and case studies, and connections to other funders. Beyond this, we will continue to look for ways to increase the support we provide to tool users, while maintaining the neutrality and integrity of our assessments.

We’re excited about the opportunity that GPR users’ feedback affords us, and are pushing forward on these changes with the goal of improving the relevance and utility of our work. We will continue to seek input and candid feedback about how we can make all of our work most successful in improving foundation performance.


Amber Bradley is a Manager at the Center for Effective Philanthropy.
