What gets measured, gets done… What then is a meaningful set of KPIs?

Not everything that is valuable can be measured, so how are these things included via an evidence base?

You hear it a lot, and there is plenty of evidence to show it: if you set a numerical target for something, then measure and monitor it, lo and behold you'll achieve it. Why? Because that is what you are being judged against, that is what you measure, and so that is what you will deliver on.

Setting meaningful and useful KPIs that let you determine how you are tracking against a key target or goal is of course important. But in setting the goal you also want to be sure you are not setting up perverse outcomes, narrowing your focus so much that you become blind to other opportunities (opportunities that could get you to your goal in a different, maybe even better, way), or discovering that you are not actually delivering what you really need, or even want, as the outcome of your endeavours.

Often it seems that the drive to have a number, so someone can say "look, here is a number and it is going up or down at an appropriate speed" (depending on what you want), overrides everything else. Perhaps most importantly, since not everything of value can be distilled down to a nice neat number, it may lead us to lose sight completely of the things from which we could truly derive the greatest value, including the value of our people.

Like so much of what I do at the moment, I seem to run into the same things everywhere. The use of quantitative KPIs as a way of judging worth and progress has dominated my life for probably the last year, ever since we had our site visit during rebidding and the panel asked me, "Kate, how are you really going to know if you have hit your targets?" But it also pervades my new job, my role as a senior female, my role as Director, my role as a PBRF-eligible researcher and …

Just two examples (outside of my direct sphere, just to shift the spotlight away from me). Last year Minister Joyce released the Draft National Statement of Science Investment. For years researchers had been encouraged (that may be a softer word than reality might reflect) to chase the target of increased research outputs, the more the better, as the measure of achievement and worth. (Remind anyone of universities being funded purely on a "bums on seats" basis?)

Chart 8 in the statement shows data for NZ, other small nations and the OECD average. Of the countries included we have the highest number of publications per $M PPP of government and higher education R&D expenditure in 2011, but the lowest percentage of publications in the top 10% of cited publications. Highest quantity, lowest quality (on the assumption that the top 10% is a proxy for quality – and of course we could argue that one extensively!). We got what we wanted – lots of publications! Was that the best outcome for the investment? We could argue that one for quite a long time as well, I would imagine.

The second example is the representation of women in Science, Engineering and IT careers in particular, but more generally of women in high-level positions in any field. 30% representation is widely used as a "good measure" of representation, or diversity; a level said to allow great dynamics and an appropriate range of perspectives. It is not as if only 30% of the population is female, and yet it is the universal measure – and where are we stuck in terms of representation or participation? Yep, around 30%.

Quantitative measures are enticing, even intoxicating. They are easy (well, having spent a lot of my time this past year setting them, I might not think they are so easy) and transparent, but alone they are not sufficient; they can lead to misplaced focus and even be ultimately destructive.

Value isn't fully, or even sufficiently, captured by quantitative measures, and monitoring progress can't be done purely by numbers. Developing a useful monitoring and measures framework must include a fuller consideration of value, to drive desirable behaviours and allow optimised outcomes to be achieved from people's efforts, rather than just chasing the numbers and the $.

Congratulations to the four funded CoREs announced by Minister Joyce yesterday

The competitive process has been a very long one, but it is finally over and now we can begin to consider synergies and possibilities for working together, or at least not reinventing the wheel ten times over. aCoRE – the association of CoREs, which we lead – having been in hiatus during the bidding rounds, will now be reinvigorated. The first meeting will be in June.

It has been a weird, warm and very windy time in Wellington lately; let's see what is in store for the weekend.