
Leadership development: some worrying disconnects

A recent post from HR Magazine – Leadership development ‘disconnected’ from business needs – caught our eye and got us thinking about how leadership development methods are viewed by organisations. The article references the 2015 Leadership Survey by Mercer, in which 81% of organisations surveyed failed to calculate the return on their investment in leadership development. The survey also points to a disconnect between the learning and development methods that organisations think are most effective and the ones they actually use.

This news is worrying on a number of fronts. Firstly, if organisations believe there are superior leadership development methods available, but fail to use them, we have to wonder why. If, as the article suggests, face-to-face learning is ranked as one of the least effective leadership development methods and yet is used by 63% of firms, are L&D professionals ignoring the evidence on effectiveness, or just heavily wedded to one method?

Secondly, we would question the ranking of the development methods outlined. What is the evidence for online learning being a superior method of leadership development to face-to-face learning? For all employees? In all kinds of organisations? It’s doubtful that’s the case.

This kind of message has the potential to influence thinking about L&D methodologies, regardless of the context in which they’re applied. And in taking an evidence-based approach, we have to remember the impact on, and perspectives of, our stakeholders. Online learning won’t suit every employee, any more than hot-desking or flexible working will. Assuming it does reflects a top-down, one-size-fits-all mentality that ignores the diversity of the workforce.

Thirdly, the low percentage of organisations whose leadership strategy has an associated business case points to another disconnect – that between development activities and organisational strategy. If you’re not developing your future leaders in line with your strategy, you’re simply developing them to move elsewhere, where they’ll find a better organisational fit. At best, you’re developing people into the kind of leaders you needed five years ago, not the leaders you’ll need in the next five to ten years.

Finally, the lack of ROI calculation is worrying – but perhaps not that surprising, given the minimal engagement with evidence-based training and development evaluation we’ve noted previously. Leadership development represents a significant investment, and it should be evaluated as rigorously as any other investment. And while evaluating these kinds of initiatives can be complex, it’s not impossible.
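For illustration only – the figures below are hypothetical, not drawn from the Mercer survey or the CIPD report – the basic ROI calculation itself is straightforward:

\[
\text{ROI (\%)} = \frac{\text{programme benefits} - \text{programme costs}}{\text{programme costs}} \times 100
\]

So a £500,000 programme that can be shown to produce £650,000 in measurable benefits yields an ROI of (650,000 − 500,000) / 500,000 × 100 = 30%. The arithmetic is the easy part; the real work lies in isolating and monetising the benefits attributable to the development activity – which is exactly why evaluation needs to be designed in before the activity begins.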

To connect leadership development with business needs we recommend:

  1. Linking leadership development methods to the evidence base and ensuring they are a good fit for the organisational population being developed. This has to go beyond what feels right, or what has always been done.
  2. Ensuring a robust evaluation process is in place before the leadership development activity even begins. Simply handing out ‘happy sheets’ is just not enough and can’t hope to contribute to an ROI calculation in any meaningful way.
  3. Working to ensure that leadership development strategy is firmly aligned with the organisational strategy, and reviewing this as frequently as strategy is reviewed. This ensures alignment and a periodic refresh of the leadership development ethos and focus.

We’ve also put together a set of resources and events to help you apply an evidence-based approach to evaluating training and development in your organisation.


Beyond the ‘happy sheet’

It’s the end of the financial year and your CEO corners you in the lift and asks: ‘How’s that £2 million training programme going? What impact has it had on our productivity?’ Would you be able to answer?

UK businesses invest significant sums of money in training and development every year – investment that should be objectively reviewed to calculate a return and evaluate its impact. However, the CIPD’s Learning & Development Report 2015 illustrated that the evaluation of training and development initiatives is far from widespread in the UK.

According to the CIPD, one in seven organisations surveyed undertake no form of training and development evaluation at all, while over a third reported limiting their evaluation to measuring the satisfaction of employees taking part in the activity – what is often referred to as the ‘happy sheet’.

What’s wrong with the ‘happy sheet’?

When organisations rely on what delegates report immediately following training, they miss out on a rich source of data and evidence. Typically, delegates are asked to complete a short survey about their experience of the training, how useful they found it and what observations they have about factors like training venue, facilitator and materials.

This kind of information gives you a snapshot of what the training experience was like, shortly after the training is completed. But what does this tell you about how much delegates learnt, remembered or applied in practice? Very little, if anything.

It is useful information to gather, but represents just one piece of the evaluation puzzle. Let’s look at this another way: are you investing in training and development activities to raise employee satisfaction with training? Is that really your key metric? Probably not. You should be measuring outcomes at a more detailed level, focusing on the factors that are important to you, your delegates and your organisation.

Reliance on the ‘happy sheet’ is superficial at best. It doesn’t tell you whether the training is working, for which employees, or in what way.

So, what should organisations do?

While not a new concept in itself, evidence-based practice in the workplace has grown in popularity in recent years. It involves focusing on the evidence behind interventions (e.g. training and development), evaluating this evidence and using the outputs of evaluation to inform decisions. Evidence-based practice means we don’t have to rely on blind faith or gut feel. Or ‘happy sheets’.

Adopting an evidence-based approach means we rely less on what has always been done, instead questioning assumptions, seeking evidence, weighing up data and making decisions with the intention of reviewing outcomes and learning from our experience. And learning from experience makes evaluation an ongoing process, not something that’s done only once.

Evidence can take many forms, including scientific data, stakeholders’ perspectives and your own professional experience. Combining multiple sources of data and evidence will give you a more rounded picture of any organisational challenge and prevent you from making decisions based on a single data point.

Fundamentally, evidence-based practice gives organisations increased clarity on what works, for whom and in what way.

How we can help

We’ve developed an evaluation framework and a set of resources and events to help you adopt this approach in your organisation – as well as field those difficult questions from your CEO.