From Keeping Score to Measuring Progress: One Foundation’s Approach to Self-Assessment

Date: November 1, 2018

As the calendar year comes to an end, it’s performance assessment time for employees at many organizations. Did I meet my goals for the year? Where did I do well? Where might I have fallen short? What could I do differently to improve my performance next year? Many foundation leaders are asking these questions on an institutional level as well, looking back and considering the impact their organization has had over the past year.

Assessing Organizational Performance

For many years at the New York State Health Foundation (NYSHealth), this process resulted in the production of an annual “Scorecard” that we shared publicly on our website. The Scorecard included a range of metrics meant to provide a high-level overview, both internally and externally, of our progress toward our most important goals and priorities. Internally, the Scorecard served as a touchstone for staff throughout the year.

As the old saying goes, "What gets measured gets done." If staff knew that they would be publicly accountable for a goal through the Scorecard, they would make it a priority to meet or exceed expectations. In program review meetings where potential projects were discussed, the idea was that staff could look to the Scorecard to stay focused. If a given project didn't advance our goals, maybe it wasn't the right fit for us.

The Scorecard was also an important element of NYSHealth’s commitment to being open and transparent. Foundations have a unique freedom to experiment and to fail. We believe that with that freedom comes a responsibility to share what we learn, and to be honest both about what’s working and where we fall short. Over the years, we’ve seen numerous foundations commit to transparency in sharing performance data in different ways. Some share the full results of their CEP Grantee Perception Reports on their websites; others commission in-depth progress reports.

We generally received positive feedback about the Scorecard from external audiences, particularly from other funders who were impressed by the level of detail and transparency. But internally, we were grappling with some existential questions about it. Were we measuring the right things? Were we too narrowly focused on numeric targets? Were we providing appropriate context for our work? Were we being honest enough about our shortcomings?

We remained committed to measuring our progress, being transparent, and holding ourselves accountable for our work. But we needed a fresh approach to operationalize those principles effectively.

Measuring Progress Versus Keeping Score

In 2017, NYSHealth began to build up an internal policy and research team for the first time, which presented the right moment to take a hard look at our Scorecard. One of the first decisions we made was to move away from the nomenclature of “Scorecard,” which can have a punitive connotation. While we believe it is appropriate to view the document as a “north star” — i.e., a constant reminder of what we are trying to achieve and how much work needs to be done to get there — it is not a tool for rating individual staff performance. We changed the name of the product from “Scorecard” to “Progress Report” to help ensure the document was viewed as aspirational in nature, as opposed to punitive.

Capturing More than Just the Numbers

Consistent with the shift away from the Scorecard framework, we also decided to move away from purely numeric goals. Numeric goals have the advantage of being concise and allowing for a straightforward interpretation of whether a goal has been met. For example, to measure progress on our goal to spread effective diabetes prevention programs, we identified a target of having 65 such programs in New York State within a particular timeframe. Keeping score was simply a matter of counting.

However, it was often unrealistic to set numeric targets in advance. In some cases, we didn’t aim high enough and met the targets more quickly than expected. In others, the targets were too high and were never met. The targets were often arbitrary. Was there something magical about having 65 diabetes prevention programs? Was it meaningfully different from 55 or 75 programs?

Another challenge was whether and how to adjust our targets over time. If we hit a goal early, should we just keep raising the bar? If a target seemed obviously out of reach over time, should we accept that we’d fall short, or revise the target down? A target that is constantly in motion is especially hard to hit.

The numeric measures were also too narrow in scope. We had measures that reflected components of one or two projects within each of the foundation’s larger strategic program areas, but that failed to paint a full picture of what the overall program was trying to achieve. For example, in our veterans’ health work, we counted the amount of federal dollars flowing into New York State to support veteran families. We had funded a specific project to help New York communities compete successfully for these funds — but it represented only a very small part of a broad program to improve veterans’ health in New York State.

In our new Progress Report, the goals for each program are high level and capture a larger breadth of our work. To capture our progress on our veterans’ health work, for example, we now focus on two broad goals — increasing the visibility of veterans’ health issues and increasing access to comprehensive community-based services for veterans and their families — and highlight a range of activities and outcomes related to those goals. When possible and appropriate, we use numeric measures to describe our progress.

Numbers have value, but the availability of numeric data is no longer a primary driver of the measures we choose.

Knowing What It Is and What It Isn’t

Even though we strive to give a full picture of the impact of our grantees' and our own staff's work, the Progress Report does not need to be a full inventory of the impacts of each project. We need to balance a desire to be thorough with a desire to share information that is relevant and digestible. To keep ourselves focused — and to increase the likelihood that external audiences will actually read the report — we aim to provide a high-level view of the goals and progress on core activities.

The Progress Report is also just one of many ways in which we measure progress and share lessons and outcomes of our individual projects or related bodies of work. We also produce Grant Outcome Reports for many of our projects, briefly summarizing the purpose of the grant, which goals and outcomes were achieved (or not), and what lessons can be gleaned from the work. In addition, we issue longer reports to capture the outcomes of clusters of work (for example, a report on the first three years of our efforts to improve access to healthy foods and opportunities for physical activity in six neighborhoods throughout New York State). And we may do more detailed assessments of our work; for example, we commissioned an external inventory of the foundation’s impact on health reform implementation in New York State.

Our Progress Report is a self-assessment; it is not meant to be a rigorous study of every aspect of the foundation's work.

Providing Context

An issue every foundation must grapple with is how to distinguish its impact from the full range of other contributing forces and factors: the political environment, related initiatives by other public and private funders, and so on. For example, in 2014 (the year the major coverage provisions in the Affordable Care Act took effect), one of our top goals was to ensure that as many New Yorkers as possible had health insurance coverage. And, indeed, many New Yorkers gained coverage that year. But how much of that progress was attributable to NYSHealth’s work, given the many other forces in play? No matter how successful the foundation’s efforts, it would have been disingenuous — not to mention implausible — for us to take full credit for those gains.

Our Progress Report format gives us more flexibility to describe our impact through narrative, providing context for our work and the work of our grantees. It also gives us the space to talk about how these efforts have interacted with those of other key stakeholders.

Balancing the Good with the Bad

We believe our foundation (as well as most — if not all — others) does important work to make a positive difference in people’s lives. However, not all of our efforts have led to the intended impacts, and some have fallen short of expectations. Sometimes the environment changes in unexpected ways; sometimes a particular strategy simply misses the mark. We try to learn from that and incorporate those lessons into our efforts.

Another old saying applies here: "If you have met all of your goals, you are not aiming high enough." Falling short on a few goals is not necessarily a bad thing. We believe the Progress Report should be an honest assessment, and we tried to reflect that even when we focused on numbers in our Scorecard. External audiences have told us they appreciate the honesty; it bolsters our credibility. We also think sharing our misses and challenges helps distinguish the Progress Report from an annual report that highlights only the great things we've done (we write those as well).

The approach that we have taken may not be the best one for other foundations. In fact, the approach may need to change over time for any foundation, as it did for us. Self-reflection and evaluation are a constant process for foundations; public documentation may be only a small part of that process. But as stewards of resources to advance the public interest, it is incumbent upon us to be transparent about our successes and our failures and to share with each other what we are learning.

Mark Zezza is director of policy and research at New York State Health Foundation. Maureen Cozine is senior director, communications, at New York State Health Foundation.

Editor’s Note: CEP publishes a range of perspectives. The views expressed here are those of the authors, not necessarily those of CEP.
