

Struck by Duck: Do We Really Need This Data?

Date: October 20, 2015

Kevin Bolduc

Vice President, Assessment and Advisory Services, CEP


  • Struck by duck (W61.62)
  • Sucked into jet engine, subsequent encounter (V97.33XD)
  • Pedestrian on foot injured in collision with roller-skater (V00.01)
  • Bitten by pig, initial encounter (W55.41XA)

If you’re Facebook friends with any physicians, your newsfeed probably started filling up with statements like these on October 1. Why? Because that was the day on which medical coding systems were forced, after years of planning and delay, to migrate from ICD-9 to ICD-10.

ICD-9, the International Classification of Diseases, 9th Revision, had a trifling 13,000 codes that medical providers used to code all patient diagnoses. The newly introduced ICD-10 sought to provide a more logical, detailed, and comprehensive system. So 13,000 codes expanded to about 68,000, including those above. There’s an accompanying 115-page CDC guide just on how to use the codes (and a very unofficial 72-page illustrated guide).

That’s a lot of hyper-specific data collected by a lot of very busy, highly trained people.

Why is this relevant to philanthropy? And why am I writing this post — other than the fact that it’s just fun to type things like “Struck by turtle (W59.22)”? (I mean, honestly, a turtle is the epitome of slow. Were you just not paying attention when the turtle came at you? Or was someone wielding that turtle as a weapon?)

The relevance lies, I think, not in confrontations with squirrels (W53.21), macaws (W61.12), and orcas (W56.22), but in the reminder to be thoughtful about the scale, scope, and cost of data creation and collection. It’s very easy to lose sight of those fundamentals in the heady, theoretical moment of design, and it’s tempting to try instead to capture everything you might ever need, increasing the effort, the cost, and the subsequent moral responsibility to use all that collected information. For example, will this code ever be worth the cost of creating it, documenting it, and building it into innumerable medical systems: “Unspecified spacecraft accident injuring occupant (V95.40)”?

Whenever I have the chance to speak with funders about the design of a scorecard or dashboard, a potential assessment, or a grantee application or report form, I encourage them to carefully consider a few things: What is the specific purpose of the data? Who is its primary audience? (Funder board? Staff? Grantees? Other stakeholders? The general public?) And how, ultimately, will the data be used for learning and to adjust funder strategies and activities?

I asked my husband, an oncologist at an academic medical center, whether he is directly using any of the ICD-10 information to learn something about his practice. He told me that eventually doctors might get some useful comparisons of the codes they use versus those used by other doctors. But for now, those codes are mostly getting sucked into the abyss of medical records. And because no one is really sure how the coding info will be used by powerful insurance companies, there’s also considerable worry among his peers that those payers will use it to find new reasons to deny payment or require more paperwork from already stretched physicians. (There is, at his organization, however, a good prize for the first person to use “Burn due to water-skis on fire (V91.07).”)

This lack of meaningful understanding of how information might be used or misused isn’t so far afield from some grantees’ experiences when it comes to reporting. Many grantees continue to think that grant reports just disappear into a black box — or worse, a fiery furnace.

Applying for and reporting on grants often requires gathering lots of different pieces of data and making them fit particular funder-prescribed formats. The typical grantee surveyed as part of CEP’s Grantee Perception Report indicates spending about 30 hours over the lifetime of their often one- to two-year grants on application and reporting requirements (a figure that rises to about 50 hours for grantees of the larger funders in our dataset). Yet only about 50 percent of grantees report that their funder discussed with them the information they submitted — even though, as CEP research points out, those discussions are one of the clearest actions a funder can take to make a reporting process helpful in strengthening grantees.

And what about the cost of creating data? I’ve written about our shuttered Strategy Landscape Tool, which was a collaboration among CEP, Monitor Institute, and dozens of funders to provide interactive mapping that openly shared information about grant-level strategies, tactics, grantees, geographic targets, beneficiary information, and so on. We and the funders we were supporting wanted the system to track a bunch of information that might be relevant for better understanding overlapping strategies and, therefore, the potential for improved or collective strategies. The online mapping was sophisticated, but after the initial burst of energy as funders coded their grants data and strategies, the burden of updating all that information became too much. “Put it all together,” I wrote in 2013 as we ceased providing Strategy Landscape tools, “and we learned that this was too complex and customized a tool — and therefore too expensive and time-consuming — for what most funders want right now.” We could have made cost and day-to-day utility more prominent in our upfront planning for this work.

Look, I love data, and I believe it’s a crucial component of effective philanthropy. We see examples of great funders that use the reporting information they collect thoughtfully. Likewise, I get that the ICD-10 data will likely be useful to epidemiologists and public health officials. (At the very least, we’ll soon know the aggregate incidence and cost of duck-human striking across the entire country.)

But as you think about designing your next data collection initiative, be sure to stay focused on the fundamentals: purpose, audience, utility for learning and action, and cost.

Data that is broadly relevant.

Data like this ICD-10 code: “Problems in relationship with in-laws (Z63.1).”

Kevin Bolduc is vice president, assessment tools, at CEP. Follow him on Twitter at @kmbolduc.

Editor’s Note: CEP publishes a range of perspectives. The views expressed here are those of the authors, not necessarily those of CEP.
