There has been lots of talk lately about foundations and “transparency.” Perhaps the most prominent foundation transparency initiative is a website called Glass Pockets, launched about a year ago by Foundation Center. Its tagline? “Bringing transparency to the world of philanthropy.”
CEP was one of a number of partners that worked with Foundation Center as it designed this effort, and we were pleased and proud to do so. We favor the general idea of foundations being more open about what they do and how they do it, and I agree with Foundation Center President Brad Smith that the record of foundations overall is not so hot when it comes to transparency. So I applaud the effort Foundation Center is making, under Brad’s leadership, to promote transparency.
In fact, as I prepared for the event on “Transparency and Effectiveness” that Foundation Center and CEP co-hosted in San Francisco this week, I started to worry about whether we’re going nearly far enough. I started to worry about whether we are settling for superficial transparency, about whether the inevitable push to create simple checklists is resulting in good information – or misinformation.
I worry that, in the worst case, we could witness the kind of counterproductive behavior that we see in the world of colleges and universities, where preoccupation with placement on the U.S. News and World Report rankings has led some institutions to take steps to improve their standing – to game the system, really – that are counter to what is best for their students. I worry that we’re not discussing critically the information foundations are being transparent about – and I am not sure there is value in transparency without engagement. And I worry that we’re overemphasizing transparency about trivial matters and under-emphasizing transparency that really facilitates greater effectiveness.
Which brings me back to Glass Pockets. On the Glass Pockets website, 32 foundations to date are listed under the question “who has glass pockets?” because they have agreed to be assessed on whether they disclose information in categories such as “Basic Contact Information,” “Governance Policies & Information,” “HR/Staffing Policies & Information,” “Financial Information,” “Grantmaking Information,” and “Performance Measurement.”
Some don’t do too well – it turns out some of their pockets may be made of tinted glass.
But what about those who do perform well? What about those receiving magnifying glasses or bullhorns on the site (the icons Foundation Center uses to give credit, and which link to the supporting material) in the various categories as indications of their transparency? What does it take, exactly, to get credit in the various areas?
I don’t see the categories as even close to equally important. So I was particularly interested, given CEP’s focus, in foundations that received magnifying glasses on the four dimensions of the “Performance Measurement” category. The specific dimensions in that category are: “assessment of overall foundation performance,” “knowledge center,” “grantee feedback mechanism,” and “grantee surveys.”
Much of what I found was less than inspiring. One foundation receives a magnifying glass for “overall foundation performance assessment” for a consultant’s report based on interviews with leaders in the community – with absolutely no information about methodology or number of interviews conducted. Another foundation links only to the Grantee Perception Report (GPR) provided by CEP. While I think the GPR can be an important component of an overall assessment, should it be enough, on its own, to qualify as an “overall foundation performance assessment”? I am not sure.
One foundation that gets credit on Glass Pockets for conducting a grantee survey does so because it self-administered a survey several years ago and got a 22 percent response rate – and what that corresponds to in terms of number of responses is not disclosed. The questions are poorly constructed, with response options that virtually guarantee positive results – and there is no comparative data. Yet among the “findings” are that there are “very favorable views of [the foundation] in all areas” and that the donor “would be proud!” I wonder. (I am not exactly objective here, I realize. We have our own approach to grantee surveys and a particular point of view about what constitutes a rigorous survey effort.)
Another foundation gets credit for having a “grantee feedback mechanism” simply because it posts contact information for a staff member that grantees can reach out to.
Some of the foundations that do well on Glass Pockets absolutely deserve to. There are funders with impressive supporting materials in the “Performance Measurement” category. But it’s impossible to distinguish between them and others whose commitment seems more cursory based only on what’s listed on Glass Pockets. You have to go to the foundations’ websites to figure it out, and I am not sure how many visitors to Glass Pockets will take the time to do that. I think the natural tendency will be to assume that those receiving credit for doing performance measurement, and being public about it, are doing something really meaningful.
So what should be done? A few ideas:
- I would like to see Glass Pockets add a space for comments on the quality of what foundations have been transparent about. If we don’t engage with what foundations are making public, if it just sits on websites, then how valuable is it?
- I would like to see more discussions about the substance – both online and face-to-face at meetings such as the one we held in San Francisco. Is transparency a good in its own right? Or is it a means to an end of greater effectiveness? Are transparency and effectiveness ever in tension?
- I’d like to see Glass Pockets do more to push for openness about programmatic effectiveness – so foundations are encouraged to disseminate information about what works and what doesn’t. That seems far more important to me than criteria such as whether a foundation has a staff list on its website, or a newsletter, or a blog.
- I also wonder about the potential importance of the Charting Impact initiative developed by Independent Sector, BBB Wise Giving Alliance, and GuideStar USA. This effort encourages organizations to publicly answer five “deceptively simple questions,” including “What is your organization aiming to accomplish?,” “What are your strategies for making this happen?,” and “How will your organization know if you are making progress?” Only one foundation, the William and Flora Hewlett Foundation, is listed on the IS website as having completed a Charting Impact Report. (Full disclosure/hypocrisy check: CEP has not yet done this, although our answers to essentially the same questions can be found in our publicly available strategic plan.) Transparency (and clarity) about the answers to these questions on the part of all major foundations would be a step forward.
I think the Glass Pockets effort is a promising one. Foundation Center deserves a lot of credit for taking a stand and pushing this forward. I also like the direction of the Charting Impact initiative.
We should all work to raise the bar on what constitutes transparency that matters. I hope we can push for much, much higher standards for the depth and quality of information necessary for a foundation to get credit for its commitment to performance measurement, for example – or, for that matter, for its transparency in general.
Foundations have made progress when it comes to transparency. But there is a long, long way to go.
Phil Buchanan is President of the Center for Effective Philanthropy.