At CEP, we are in the business of gathering feedback. A significant part of our work involves helping philanthropic funders gather comparative feedback from key stakeholders.
But we need feedback on our performance, too, and so we regularly seek to understand how users of CEP’s assessment tools perceive us – and whether they are making changes in response to the findings they receive. After all, if our tools do not result in meaningful change, what’s the point?
So last year, we commissioned an independent evaluator (LFA Group: Learning for Action) to survey 2009-2010 subscribers of our Donor Perception Report (DPR), a newer tool that aims to help community foundations understand how they are doing in the eyes of their donors. The DPR seeks to put community foundation performance as judged by donors in a comparative context, just as CEP’s Grantee Perception Report (GPR) does for grantee perspectives – highlighting both relative strengths and areas for improvement. Since 2009, over 30 community foundations across the U.S., large and small, ranging from the Chicago Community Trust to the Dallas Foundation and the Napa Valley Community Foundation, have participated in a DPR.
As the manager at CEP responsible for this tool, I was eager to better understand whether it was making a difference. LFA focused on foundations that had used the DPR at least a year prior to the survey, in order to gather data on what changes had been made, so the number of foundations studied was small – just 11. Nonetheless, the findings give us an early indication of the tool’s utility. Here is some of what we learned from this early assessment:
- Satisfaction: Respondents indicated a high average level of satisfaction (at least a 6 on a scale of 1 to 7, with 1 being “not at all satisfied” and 7 being “very satisfied”) in five out of six key areas of their experience, including the DPR experience overall and the extent to which the DPR helped deepen the foundation’s understanding of its donors’ needs and interests. Themes that emerged from the interviews included high satisfaction with the design of the donor survey instrument and with the resulting report. (Page 6)
- Creating Change: 100 percent of respondents reported that the DPR drove some changes in their work, with 55 percent indicating they made “significant change” in at least one area of their foundation practices or strategy. Most subscribers reported at least some change in their approach to working with current donors, engagement of new donors, foundation strategy, and collaboration among donor staff and others in the foundation. (Page 13)
- CEP Staff Helpfulness in the Process: On a 7-point scale (where a score of 7 is associated with “very responsive” or “very helpful”), the mean score for responsiveness of CEP staff to questions was 6.7, and the mean score for helpfulness of CEP staff responses was 6.9. (Page 7)
- Opportunities for Improvement: Subscribers shared several ideas for how to enhance the value of the DPR experience, including:
- increasing the size of the comparative data set;
- having CEP provide a “refresher” or continued access to comparative data as new community foundations are added;
- increasing in-person contact and dialogue around how to implement changes suggested in the findings;
- developing more examples of how other similar community foundations had successfully implemented changes based on DPR results. (Page 7)
At CEP, we are committed to seeking feedback on an ongoing basis to help improve our processes. We are very gratified that both the tool and CEP’s work with DPR users were viewed quite positively in this initial feedback. Still, we have begun to address several of the areas for improvement for this year’s subscribers, and we will be commissioning a similar assessment next year in our effort to continue to learn and improve.
Grace Nicolette is a Manager at CEP.