In the last few months, CEP officially launched the Donor Perception Report (DPR), a donor survey for community foundations and other funders that receive donations. So far, eight community foundations have participated, with the San Francisco Foundation and the Chicago Community Trust serving as the original pilot testers. The next six subscribers, some of whose participation was supported by the James Irvine Foundation, helped us develop the initial comparative dataset.
The Napa Valley Community Foundation, which received comparatively high ratings from its donors on a number of measures, has already posted its Donor Perception Report online. The donor feedback above, describing the foundation as “engaged,” “responsive,” “informed,” and “effective,” comes from that report. The foundation’s CEO, Terence Mulligan, had this to say about the process and report: “Unlike other donor surveys, the comparative data helped us identify the areas of our work that are making a big difference for our donors, and also areas where we can improve.”
But there’s more power in greater numbers. As additional foundations participate, the resulting data will offer an important opportunity to investigate key questions about donor experience, values, and expectations. Hypotheses abound about what can set community foundations apart in an increasingly competitive philanthropic environment, and comparative data will be instrumental in setting the record straight.
We’re already learning a lot. Although we can’t claim that data from just eight foundations is representative of community foundations more broadly, the comparative results are intriguing. In a preliminary look across these eight foundations, two predictors of donor satisfaction stand out as more powerful than the rest:
- donors’ perceptions that the community foundation exhibits a strong community leadership role
- donor satisfaction with the financial aspects of the foundation’s work: both its investment performance and strategy and its administrative fees
I found the latter point fascinating. Donors who thought that investments, administrative fees, or both “need improvement” rated their overall satisfaction much less positively. What surprised me is that the two appear almost equally powerful in predicting donor satisfaction. Given what I hear from community foundations, I expected administrative costs to be the dominant predictor. They’re not.
With results from only the first eight community foundations, these “findings” are better viewed as questions for future analysis:
- Will the same findings hold when we’ve provided feedback for and have data from a couple dozen foundations?
- Is donor satisfaction strongly linked to investment strategy and performance only because we’ve been in a terrible investment environment, or will the same finding hold true in a couple of years?
- What do these findings suggest about the tension between donors’ preference for visible, bold leadership and foundations’ oft-stated desire to put nonprofits front and center (something donors mention infrequently)?
In the coming months, I hope that readers can help us generate insightful questions to ask of our data as well as encourage community foundations to join this initiative.
****
Kevin Bolduc is Vice President – Assessment Tools at CEP