By Molly Penn
Measuring success in executing your strategic plan starts with defining what success looks like. That definition begins with a clear vision of your intended impact, which shows up in your vision statement, your mission statement and the goals of your strategic plan.
Why is measuring impact so challenging? The tricky part is knowing how much of the progress toward your mission is attributable to your work versus the circumstances around you.
Now you might say, who cares about attribution as long as it’s working? And that’s a perfectly acceptable point of view. But the added nuance is that ideally, organizations develop strategic plans so they can learn what is most effective.
If you can’t attribute the progress entirely to your efforts, or discern which portion of it is attributable to your efforts, you haven’t really learned about the most effective ways to deliver on your mission. That is the rub of measuring strategic plans. Let’s look a bit more deeply at this idea.
This cone model comes from futurist approaches to planning. It illustrates that the more you seek data, evidence and certainty, the less aspirational you can be. If you want exactitude, you need to measure the more tactical pieces, which are generally accomplishable in a short time frame – the next year or two. These also tend to be things you have more exclusive control over.
By its nature, the farther out you move toward goals and systems-level evolution (and the longer it takes to accomplish your intentions), the less data, evidence and certainty you will have. These are the areas where you don’t have exclusive control (which is the whole point – it is systems-level change with multiple players).
It’s important that we understand this when thinking about measuring strategy. We need to be clear and intentional about what we are seeking information on and what that information will tell us.
Where we often see organizations falling into a trap is when they only measure “did we do what we said we were going to do” instead of measuring “did our work produce the outcomes we were hoping to produce?”
Many organizations default to “did we do what we said we’d do?” because the board wants to know that the staff is working the plan. Fair enough – but on its own, that is a wasted opportunity to engage the board with the staff in wrestling with the big questions and challenges in the work. Both measures are important – you don’t want all of one or all of the other. Let’s think about what we want to learn by measuring each.
Did Our Work Produce the Outcomes We Wanted?
Let’s say you look for a particular outcome of your work – something specific changing for the people you serve – and you’re not seeing it. If you provided the activities you expected to provide but you’re not seeing the change, that tells you that either the activities were off-base or your assumptions about them were. Maybe you focused on the wrong areas, or maybe your activities aren’t enough by themselves – maybe you need to supplement them with other kinds of activities.
Alternatively, it could be that your assumptions were just miscalibrated. You learned along the way that those activities did not lead to the outcomes you wanted to achieve. So you need to figure out what other kinds of activities would be more effective.
Did We Do What We Said We’d Do?
As we mentioned earlier, many organizations default to “did we do what we said we’d do?” because the board wants to know that the staff is working the plan. This is where boards often fall prey to thinking their role is “supervision” of the staff instead of thinking about impact.
When you combine these two kinds of measures – did we do what we said we’d do, and did it produce the outcomes we intended – reviewing your outcomes leads to some really meaty conversations with the board about whether you took the right approach and what you are learning.
Be Selective About What You Measure
While measuring your work is considered best practice, we advocate measuring only the most important pieces of information – not everything. Otherwise, you’re creating administrative work that pulls you away from serving your mission. If you’ve chosen approaches that others have already proven to be solid (like research to educate or raise awareness), those probably don’t need to be measured to ascertain their effectiveness. We know from experience that research helps educate and raise awareness. It’s about being thoughtful and strategic about what you measure.
Measure to Learn
The common misconception about measurement is that it is a kind of report card – what was your grade?
In fact, the best practice is that measurement should lead to learning – it is a way to keep you focused on the biggest picture, which is learning what is effective to fulfill your mission.
When to measure should be driven by the level of activity you are measuring. If your KPIs are related to tactics, you probably want to measure them annually. If they are related to outcomes, you might want to measure every few years. Realistically, the timing of measurement is also determined by capacity factors:
- When do you have time to convene everyone to learn from what you measured?
- Do you have the systems built to measure and use the data to support institutional learning?
Building Learning Organizations
That brings us to learning. It is important to hold space to process what you are learning from your metrics. What are they telling you about your effectiveness against your mission – the overriding litmus test of your strategy? This is one of the best conversations you can have with your board, because you have a group of people from different organizations with different perspectives. Getting them to help you process these learnings will lead to rich dialogue, greater understanding and appreciation of the work, and a more effective board. The outcome of these processing sessions should be the actions you want or need to take as a result of what you’ve learned.
We developed this template to help organizations get over the common fear of making changes to their strategy. So much time and effort goes into developing strategic plans that there is often a reluctance to change anything. But what keeps plans alive and effective is iterating on our approach so we get better over time. Each board member can complete the template individually in a learning session and compare results at the end, or you can complete it together as a group. The point is not to overthink it or let it make you feel stuck. It is meant to help you be agile and adaptive in how you approach your strategic plan.
In the end, the purpose of measuring is to learn from what you’re doing so you can become more effective at delivering on your mission.