SHINE is an education charity operating in the North of England. Between their first engagement with CEP’s Grantee Perception Report (GPR) in 2019 and their second in 2022, SHINE made some major changes: moving their headquarters, shifting their strategy, and building new and deeper relationships. To find out more about how the GPR influenced these shifts and helped the organization understand the effects of these changes on grantees, I had the chance to chat with Helen Rafferty, interim CEO of SHINE.
Chloe Heskett: This was SHINE’s second GPR, with the first taking place in 2019; why was it important to the Trust to do another? Were there any particular considerations or questions you wanted to address in this second, more recent survey?
Helen Rafferty: The first GPR caught SHINE at a pivotal moment in our development as an organization. In 2017, we had substantially changed our focus, and indeed the location of our headquarters, to better serve disadvantaged communities and stakeholders away from London. Our shift to a new base in the North of England, and away from many established relationships, came through clearly in the GPR. We heard that organizations in our new base experienced positive relationships with SHINE, but were perhaps more tentative about our impact on their organizations or our understanding of their communities. Partners in London reflected confidently on the relationships we’d built there, but understandably felt less sure about our new funding focus and their alignment with our priorities, given that we’d moved away from supporting their work or their area.
For us, it felt critical to follow up with grantees in the North of England, having been established here for a few years, to understand the impact we were making and whether we’d laid the right foundations and built the right kinds of relationships in our time here. It was key to us, therefore, that we asked the same questions as in the first GPR, to understand our growth and the depth of our roots here, and of course to hear how we might improve further.
One particularly interesting insight from the first GPR was that we were rated less favorably by female grantees than by male grantees. As an organization with a female CEO and substantially more female staff, we spent some time reflecting deeply on this finding and exploring the ways in which our approach might create this disparity. We were pleased to find that it didn’t come up in the second GPR survey. Without this follow-up, we wouldn’t have had a way to know that our reflections, and some changes in approach, had been successful in addressing it!
CH: In the reflections you shared on the SHINE blog about your GPR, there’s a clear emphasis on action. Can you share how those actions were decided upon — did the data you received in your GPR influence those focal points?
HR: Our first GPR was probably an exercise in curiosity: we wanted to understand in depth how our grantees saw us, and to know more about our standing and relationships. This time around, I was determined that any request for feedback should be a genuine exercise in listening and responding if we were to continue to build foundations in the North of England. For grantees, filling in the survey is a time commitment, and from the outset I was keen that we recognized the value of what our grantees were willing to do for us and tell us, and that we were authentically prepared to adapt and change our practice in response to the feedback. In a sense, anything other than a commitment to action would have meant either conducting the GPR as a vanity exercise or cherry-picking which feedback we found valuable and what we thought was worth hearing. So, the emphasis on action was built into the process from the outset.
The focal points for action were then decided directly on the basis of the GPR results. Where we received positive feedback from our grantees, we discussed ways we could keep doing the things that had likely produced it, or build further on the good parts of our practice. Where we heard that we compared less favorably with other grantmakers or scored less well overall, we discussed directly what was in our power to change (given our current resources and processes) and what we could commit to in order to improve the experience for our grantees.
We do recognize that there’ll always be an inherent power dynamic between a grantmaker and grantees, but we also want our brilliant grantees to feel they are part of a SHINE ‘family.’ We can’t meaningfully build this unless we’re willing to let our grantees directly influence our decisions and actions.
CH: When you and I talked about SHINE’s post-GPR experience a while back, you mentioned that there was a carefully considered process of sharing the survey results with grantees; can you share more about that? Why did the Trust feel it was important? Have you gained any additional insight into your grantees’ experiences through this process?
HR: Sure. I suppose this plays into the same point: we can’t hope to have our grantees feel as though they are part of any kind of SHINE ‘family’, or demonstrate that we value the time they take to give us feedback and insight into their experiences, unless we’re happy to then share that feedback and be open about what we’ve learned. In their projects, I think we ask a lot of our grantees in terms of reporting and accountability, but also reflection and vulnerability. We ask grantees to tell us what hasn’t worked, and how they hope to adapt, as much as we ask them to tell us what has worked. I wanted SHINE to be just as willing to go through that process in response to feedback.
CH: We also talked a little bit about benchmarking — both against peer grantmakers and against your own past results. What was helpful to the Trust about being able to compare the most recent survey results with others’ and with your own past data?
HR: As mentioned, comparing to our own past data was essential for confirming whether we’d been successful in making specific changes in response to the first GPR. It helped us understand our direction of travel as an organization and work out whether we’d addressed some specific barriers and improved specific results.
SHINE is possibly an unusual organization in that nearly all of our grantees come from a specific type of organization within a specific sector, within a defined geography. To be (even more) specific, nearly all of our grantees are qualified teachers, currently or by background. With this in mind, it was sometimes difficult to identify the best comparator organizations, as we couldn’t be sure that others’ grantees would share the same distinctive traits or come from the same professional background. That said, it was still incredibly helpful to understand how our scores placed us against the general backdrop of organizations with a similar focus or of a similar size. It helped place the scores in context and gave us a better sense of our standing within the wider world of grantmaking. For some questions, for example the average amount of time our grantees spent on our reporting processes, we simply couldn’t have known how we compared more broadly without an external benchmark.
CH: The Trust received its survey results in November. As we’ve talked about, you’ve shared those results with grantees, and you’ve shared both reflections and planned actions in a recent article on SHINE’s website. What comes next? Are the survey results filed away until next time?
HR: CEP have been incredibly helpful in linking us to other partners who have experience responding to and following up on the GPR. We haven’t actioned this yet, but we are planning to look into whether there are further deep dives, or perhaps even lighter-touch surveys, we can conduct with grantees in the interim, to understand whether we’re making the right decisions and responding to the GPR with the right actions. Considering the reporting example again, it would be helpful to know more about our grantees’ perceptions of our reporting processes, the helpful and not-so-helpful parts, and make sure that any changes we make are responsive to that. In particular, we’d be interested in understanding how our commitment to providing at least 5 percent of the project budget for the time needed for evaluation has affected our grantees’ time, outputs, and experiences.
Other results may take longer to be reflected in grantee perceptions. For example, we’ve incorporated some of the feedback from the GPR directly into our current discussions about reviewing our strategy for the coming years. A follow-up GPR in another three years would feel like the right way to understand any longer-term shifts in perceptions that arise from that. In a sense, yes, the results will be filed away, but only until we’ve got that longer-term comparator for the impact of those bigger steers.
CH: Is there anything else you’d like to share about SHINE’s experience with the GPR?
HR: Just that it has been an incredibly positive experience for us, on both occasions, providing a quantitative overview and a depth of insight that I don’t think we could have gained by any other means. The process has given us much fertile ground for deep discussion and reflection on our practice and relationships as a grantmaker. I’d encourage other grantmakers to look into it!
Chloe Heskett is a writer and editor on the programming and external relations team at CEP. Find her on LinkedIn. Helen Rafferty is interim CEO at SHINE. Find Helen on LinkedIn and follow SHINE on Twitter.