We are confronted daily with numbers and statements, often conflicting. Take two recent headlines about a daily habit many of us enjoy: one headline from CNN proclaims, “Coffee may come with a cancer warning in California,” while another from The Seattle Times says, “No, your coffee isn’t going to give you cancer.” The Seattle Times article goes on to say, “Recent headlines suggested that your daily caffeine habit might cause cancer. Turns out that’s not what the science says.”
Sometimes it’s tough to know what to trust.
Here in Boston, I read headlines every other week touting how reliable our public transportation system is. The articles are perhaps intended to attract companies and bright minds to our city by promising an easy, convenient commute. As daily riders know, nothing could be further from the truth. The statistics chosen to tell the story of a well-functioning system are cherry-picked and come with little clarity about how terms such as “on time” are defined. Those who have read the bestselling book How to Lie with Statistics may remember its iconic cover of a man sweeping numbers under a carpet.
At CEP, our research is one of the core ways through which we seek to help foundations be more effective in their work. In CEP’s nearly 17-year history, we’ve published more than 40 reports on topics ranging from funder-grantee relationships to foundation strategy to performance assessment. (Our reports are available for free download on our website.)
If our research is going to be effective in its mission of providing data and insights that help funders reflect on and make relevant changes to how they do their work, we must hold ourselves to certain standards for every piece we produce. We believe foundations are crucial institutions, and we don’t want to provide conclusions and recommendations to their leaders that can’t be backed by rigorous research.
With that in mind, I’d like to share with you the values that guide our research at CEP, from charting the course for a project, to designing data-collection instruments, to transparently sharing our methodology.
Grounded in What We’ve Learned from Others
The first step in any research project we undertake is to understand what’s already been written or researched on a particular topic. Looking at past literature directly informs our research questions and helps us identify what would help the field move forward.
For example, we are currently conducting research on diversity. While diversity is being discussed widely, and while much has been written about foundation initiatives focused on diversity, we noticed the nonprofit voice was too often missing from what we were reading and hearing. So, we surveyed nonprofit CEOs to better understand their perspective on how foundations and nonprofits have been interacting on the topic of diversity, and to ask what would be most helpful to them.
Analyzed Using Rigorous Methods and Standards
We want our audience to be able to trust that if they act on CEP’s research, they will not have been led astray.
CEP’s quantitative datasets are often so large that most analyses we run turn up statistically significant results. Among those, some reflect very weak relationships, while others reflect very strong ones.
If foundations are going to use our research to change the way they think or make decisions, we want to be sure we are reporting on the relationships that the data indicate are important to focus on, not the weak relationships that, if used to inform change, may produce no noticeable difference in the effects of foundations’ practices. So, in addition to statistical significance, we also calculate a measure called “effect size.” This measure takes into account the amount of variation in the data and the size of the sample under study to indicate how strong a statistical relationship actually is.
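To make that distinction concrete, here is a minimal sketch, written in Python with numpy and scipy purely for illustration (the data, group sizes, and thresholds are assumptions, not CEP’s actual code or results). It shows how a very large sample can make a negligible difference “statistically significant” while a standardized effect size such as Cohen’s d remains small.

```python
# Minimal illustration (not CEP's analysis code): with a large enough sample,
# a tiny difference can be statistically significant yet practically negligible.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Two hypothetical groups of survey responses with a very small true difference
group_a = rng.normal(loc=5.00, scale=1.0, size=20_000)
group_b = rng.normal(loc=5.03, scale=1.0, size=20_000)

# Statistical significance: with n this large, p will often fall below 0.05
t_stat, p_value = stats.ttest_ind(group_a, group_b)

# Effect size (Cohen's d): difference in means scaled by the pooled standard deviation
pooled_sd = np.sqrt((group_a.var(ddof=1) + group_b.var(ddof=1)) / 2)
cohens_d = (group_b.mean() - group_a.mean()) / pooled_sd

print(f"p-value:   {p_value:.4f}")   # likely "significant"
print(f"Cohen's d: {cohens_d:.3f}")  # yet well below 0.2, a conventionally small effect
```

Reporting only relationships of at least moderate strength is, in effect, a decision to act on the second number rather than the first.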
With these methods in place, we tend to report out only those relationships in the data that are of at least moderate strength. And we are transparent about these decisions in the methodology section (and sometimes footnotes) included in each report.
We also stay up to date on practices in survey design. We experiment with new methods in our surveys (sometimes they work for the topics we study, sometimes they don’t). We take the time to use more complicated statistical methods to analyze our data when they will yield more accurate results than simpler methods. We rigorously analyze the qualitative data we collect, rather than simply reporting our impressions of it. This sometimes means our research projects take longer to complete than they would if we used simpler methods. Our funders have always supported our efforts to do our work in the ways that will yield the most accurate results, even when this means a somewhat longer timeline for completion.
Informed by Experts in the Field
Outside of CEP’s staff, clients, Board of Directors, and Advisory Board, we often reach out to experts who have deep knowledge of either the content we are studying or the statistical methods we are using. When we believe that collaborating with other institutions will make a research project stronger, we do so. For example, in 2016, we collaborated with the Center for Evaluation Innovation (CEI) on a project to benchmark evaluation practices at foundations. CEI brought in-depth knowledge of evaluation practices to the table, and we brought a team of researchers skilled in surveying foundation staff and in statistical analysis. Together, we created a research report that was, we think, more useful for foundations than what either of our organizations would have created alone.
Communicated with Transparency that Allows for Replicability
In every research report CEP publishes, we include a section that breaks down our methodology and explains in detail, among other things: what parameters shaped the sample from which we collected data; how the data was collected; our response rate and final sample size; for which key variables we see non-response bias (e.g., did we receive a lower response rate from CEOs of private foundations, or CEOs of smaller foundations?); and the analytical methods and decisions we used to examine the data collected. When relevant, we also share information about problems we encountered during data collection and analysis. We believe being transparent about our methodology is key if others are to think critically about what we did and how we did it.
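To illustrate what one such check might look like, here is a hypothetical sketch (the foundation groupings, counts, and chi-square test are assumptions chosen for illustration, not CEP’s actual methodology or data) comparing response rates across two types of foundation CEOs.

```python
# Hypothetical non-response bias check (illustrative numbers, not CEP data):
# do response rates differ meaningfully across foundation types?
import numpy as np
from scipy.stats import chi2_contingency

# Rows: foundation type; columns: [responded, did not respond]
counts = np.array([
    [120, 180],  # e.g., CEOs of private foundations
    [ 90,  60],  # e.g., CEOs of community foundations
])

chi2, p_value, dof, expected = chi2_contingency(counts)
response_rates = counts[:, 0] / counts.sum(axis=1)

print("Response rates by group:", np.round(response_rates, 2))
print(f"Chi-square p-value: {p_value:.4f}")  # a small p-value flags a group-level gap worth disclosing
```

A gap like this would not invalidate a study, but it is the kind of detail a methodology section should disclose so readers can judge how far the findings generalize.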
Created to be Useful
Ultimately, our goal at CEP is for our research to be useful to foundation leaders and to inform the important work they do on a day-to-day basis. To that end, we value feedback from funders that makes our research more user-friendly, interactive, and useful. For example, we recently sent out a research report for review so we could collect feedback on its clarity and utility from leaders in the field. Multiple people told us they thought the findings from the report would be more useful if accompanied by clear examples of how foundations are engaging in the work. So, we decided to hold the release of that research report until we can collect and write up such examples.
Feedback is at the heart of so much of our work at CEP. So when it comes to our research, it’s vital that we value and make changes based on what we hear from those whom our reports seek to help. If you’re a foundation leader who has read CEP research, I’d invite you to share your reactions to, or suggestions for, our research in the comment section below, or feel free to email me directly at ellieb[at]cep.org.
Ellie Buteau is vice president, research, at CEP. Follow her on Twitter at @e_buteau.