An AI Roadmap for Philanthropy in 2025

Date: December 4, 2024

Jean Westrick

Executive Director, Technology Association of Grantmakers

As grantmaking organizations increasingly explore how AI tools can transform the way we work in civil society, the Technology Association of Grantmakers (TAG) recently released results from a global survey of grantmakers in our 2024 State of Philanthropy Tech report. Capturing the technology and operational realities of more than 350 foundations globally, the report offers data to assess our current state and chart a responsible course for AI in 2025 and beyond.

It is clear from the findings that while AI experimentation is high, both guardrails for usage and enterprise-wide integration are still nascent. Here are a few highlights:

  • 81 percent of foundations report some degree of AI usage; however, enterprise-wide adoption remains emergent, with just 4 percent of respondents indicating AI usage across the entire organization.
  • Only 30 percent have an AI policy in place, while just 9 percent have both an AI policy and an AI advisory committee.
  • Among grantmakers leveraging AI, most report rudimentary use of generative AI for tasks such as meeting notes/video transcription (71 percent) or drafting reports, emails, and memos (67 percent).

Reflecting upon these findings, I find myself concerned. As society grapples with AI, I believe in the vital role of our leadership as grantmakers. Yet the lack of formalized structures to guide AI usage — including policies and advisory committees — raises questions about our readiness to leverage AI responsibly.

Barriers to Broader Adoption

In addition to AI usage and governance, the State of Philanthropy Tech report identifies several barriers to more meaningful and strategic adoption of AI. Chief among them are privacy and security concerns (55 percent), followed by a lack of necessary skills (43 percent) and uncertainty about relevant use cases within philanthropy (40 percent).

Foundations are, understandably, approaching AI with caution. As grantmakers, trust is our currency; building and maintaining that trust is critical to our work. While we steward financial resources, we also steward sensitive data on behalf of our nonprofit partners and must maintain the highest ethical standards. Concerns around bias, fairness, transparency, and accountability must be central to any responsible adoption strategy.

Fundamentally, these concerns are about data — and unfortunately, as a sector, we have largely underinvested in our data frameworks. Responses to the State of Philanthropy Tech report underscore this as well: only 45 percent of grantmakers reported having a data privacy policy, and only 46 percent had a data retention and destruction policy. Data is a valuable and essential asset. Good AI requires good data — shoring up how organizations collect, manage, and maintain data will be key. 

Furthermore, while many philanthropy professionals are experimenting with AI, grantmaking organizations have clearly identified a need for more knowledge and skills when it comes to leveraging AI. To address this need, TAG is launching a learning exchange in 2025 for member foundations seeking to build a strategic approach to AI, to share how they are leveraging sector-specific applications of AI, and to learn from peers.     

From AI Experimentation to Strategy: 3 Things to Do in 2025

As a sector, we have a growth opportunity to transition intentionally from experimenting with AI to developing a strategy for operational effectiveness, impact innovation, and nonprofit support. For philanthropy, this means investing not just in AI technology but also in the infrastructure — both human and technical — needed to support its responsible use. Here are three ways to get started in 2025:

  1. Set guardrails for responsible AI use and establish cross-functional ownership.
  2. Invest in cross-organizational change management efforts, building awareness, skills, and feedback loops.
  3. Commit to building a strong data culture; invest time and resources into managing and maintaining your data assets.

Establishing AI policies and cross-functional advisory committees should be a priority for every foundation. As a baseline, this will allow us to ensure privacy and safety while navigating the ethical dilemmas inherent in AI adoption. If you’re part of the 9 percent of grantmakers with both policies and oversight in place, 2025 should be a year for training, awareness, and culture-building so you can innovate safely with more strategic uses of AI.

A Call for Leadership in AI

Over the last two years, the emergence of generative AI applications, like OpenAI’s ChatGPT, Microsoft’s Copilot, or Anthropic’s Claude, has been so rapid that it is hard to remember another technology achieving such widespread adoption in so short a period of time. It is still early days. But AI is already shaping the ways we work, bringing with it both opportunities and new anxieties. The path to realizing the full promise of AI need not be marked with peril, however. Philanthropy is uniquely positioned to lead, but we must do so with intentionality. The challenge ahead is not purely technical; rather, it is a uniquely human one.

We have entered an era when technology is no longer just a set of tools but a potentially disruptive force — one that calls for informed leadership to reduce harm and amplify positive impact. We have an opportunity to reimagine our internal organizational culture, to fortify a greater commitment to learning and impact in service of our communities, and to live more deeply into our shared values of generosity, commitment, and the common good.

Editor’s Note: CEP publishes a range of perspectives. The views expressed here are those of the authors, not necessarily those of CEP.
