On June 2, Don Matteson, chief program officer at The Peter and Elizabeth C. Tower Foundation, joined CEP’s Naomi Orensten for an informative and lively webinar about the Grantee Perception Report® (GPR). In the conversation, Don discussed what The Foundation has learned from using the GPR twice — in 2007 and again last fall, in 2014 — to inform and track progress in its work, and he shared advice for funders considering embarking on the journey of collecting and learning from grantee feedback.
Thank you to Don for sharing his candid perspective on Tower’s experiences surveying its grantees, and thank you to all those who tuned in and submitted thoughtful questions. Here are some of the highlights from the chat.
Additionally, The Foundation has made the report public in its entirety; you can access it on Tower’s website.
CEP: Could you talk a little about your history of surveying your grantees? What, specifically, led you to using the GPR?
Don Matteson: We’ve always considered ourselves pretty active about going and meeting with our grantees, checking in, and doing site visits. But about 10 years after The Foundation hired its first executive director and staffed up, The Foundation trustees and staff wanted to get a handle on how we were doing. So in 2007, we commissioned a Grantee Perception Report from CEP. That was an interesting experience for us. We had never opened ourselves up that way in the past. We had always heard what grantees thought we wanted to hear, and so having the opportunity to get that feedback in an anonymous format was really valuable to us. With this confidential survey, we heard some things that I’m sure we wouldn’t have heard had we been talking to grantees face-to-face.
A few years later, as The Foundation faced a moment of strategic change, we went back to that Grantee Perception Report as a framework for helping us think about our future work, specifically our increased focus on community impact. With these changes, we decided we wanted to check in and see how we were doing. That compelled us to go back and do another Grantee Perception Report last year so we would be able to compare how we’re doing now to how we were doing seven years ago.
CEP: Can you share a little more about what you learned from your grantee feedback?
DM: In the first report, we learned our grantees didn’t feel like we had a very good understanding of the fields we worked in at the time — developmental disabilities, mental health, substance abuse, and education more broadly. Our grantees felt we didn’t necessarily have a whole lot of impact in their fields and that we didn’t truly understand the organizations we were working with. Giving money is great, but we weren’t having the impact we wanted to have on those organizations. We weren’t really thrilled with that and wanted to see that change.
So, we focused on what we were going to do to address those issues. Our previous grantmaking was very transactional — submit a proposal, we’ll give you some feedback, and we’ll decide to make a grant or not. It was kind of “arm’s length” grantmaking. So we shifted gears. We’ve moved toward a new approach in which we’re having more conversations, doing environmental scans, and getting out of the office to talk with grantees more.
That shift seems to be bearing some fruit for us. When we got our 2014 results, we saw we had gone from close to the 25th percentile up to the 76th percentile in terms of the way our grantees perceive our understanding of the fields we’re working in. People feel we understand what’s going on and that we’re making more of a difference than we had in the past. That progress was certainly a function of the feedback from the first Grantee Perception Report.
CEP: Could you talk about the value the comparative data added as you were thinking about what your results meant?
DM: Without the comparative data, you’re operating in a vacuum. We really wanted to get a handle on how we stacked up to foundations that do similar work, have similar staffing profiles or endowments, or work in our communities. So when the process began, staff were asked to identify potential funders for our custom comparative cohort. There was a level of comfort knowing we were doing an apples-to-apples comparison. I certainly wouldn’t want to compare Tower head-to-head with Gates or Ford or any of the huge foundations. The comparative data really helps you calibrate your expectations and your understanding of your results.
CEP: What are you doing differently in your own work as a result of the GPR? Is there anything in particular that you’ve decided to focus on or change in your own professional practice?
DM: For the 2014 GPR, in addition to the organization-wide report, we also commissioned subgroup breakdowns so we could see grantee feedback by individual primary contacts. This allowed me to see how the grantees I work with rated me on things like timeliness of response or fairness. This level of detail has been incredibly useful to me. My scope of responsibilities is such that I’m spreading myself pretty thin, so I feel like the grantees I work with sometimes are not getting the timely and responsive feedback that they deserve. That’s something that I am working on in my own practice. Additionally, these individual breakdowns mean I can talk with each program officer about what we would like to do in the next year to help build our individual skills in certain areas so that all of us can be more effective in our work.
CEP: Do you have any advice for funders considering using the GPR?
DM: The biggest piece of advice — which may be the hardest piece of advice to take — is to put ego aside and take the opportunity to truly listen to what grantees have to say. We had to stop being defensive about things we were hearing and stop saying, “We’re doing it right, you just have to follow our directions,” and say instead, “Okay, let’s truly look at this from the grantee perspective. What is this really telling us?”
Instead of seeing critical feedback and dismissing it, you have to overcome that initial visceral reaction, embrace the feedback, and consider whether there is something you can do to address the issue. Be prepared for whatever feedback you get — some of it will be good, but some of it will not be as happy as you’d like. You have to put the ego aside and take the bad and the good in equal measure.
CEP: Anything else you want to share?
DM: Using the GPR was a really positive experience for our Foundation and for me, even in 2007 when the results were not as positive as we would have hoped. We have been able to make changes that really seem to have improved our ability to help the communities we are serving.
The Peter and Elizabeth C. Tower Foundation, located in Getzville, N.Y., supports community programming that results in children, adolescents, and young adults affected by substance abuse, learning disabilities, mental illness, and intellectual disabilities achieving their full potential. Follow The Foundation on Twitter at @towerfdn, and follow Don at @towerdwm.