For too long, learning and development (L&D) practitioners have claimed that what they do is too subjective to be measured. But business leaders want to see more evidence of why learning interventions work, and savvy organisations are embracing this. Roisin Woolnough investigates.
“It’s fantastic.” “Everyone loves it.” “I don’t know why, but it just works.” For too long now, learning and development departments have relied on statements such as these to make important training decisions. L&D has also trotted out these same statements to validate and communicate results to the business and even the board.
How organisations view evidence-based L&D
- 77% of L&D practitioners consider learning analytics a priority, but only 18% of line leaders think L&D effectively communicates the impact of its interventions to the wider business.
- Only 24% of heads of L&D believe their L&D staff are effective at using the right metrics to measure the function's performance.
- Only 16% of L&D practitioners use data and metrics proficiently, and only 25% have proficient analysis skills.
But business leaders are tired of this approach and it is easy to see why.
What they want to see and hear is actual evidence that training has worked, why a new training module will make a difference, and that any investments will result in successful outcomes. They make decisions based on evidence and they expect L&D to as well.
Many parts of the business have been using data as a business-critical tool for some time. It is not a new phenomenon, yet many believe L&D has been slow to catch on.
“An evidence-based approach is very hard to find in most L&D operations,” says Nigel Paine, international speaker, L&D specialist and former head of learning at the BBC.
Instead, L&D has often continued to make decisions without hard evidence to back them up.
And when it comes to measuring the results of those decisions, L&D may have been using data, but it has often been the wrong sort.
Success is often measured in terms of the size of an organisation’s training budget, the number of training hours employees clocked up in a year and the number of courses on offer.
These numbers tell L&D and the business very little about how successful the training actually is and whether or not it delivers the required results.
Jean Martin, talent solutions architect at global member-based advisory company CEB, thinks L&D has a lot of data, but extracts little in the way of meaningful insights from it.
“As such, L&D is unable to help business leaders understand the implications of data for achieving their business outcomes,” says Martin.
It is time for L&D to modernise, and that modernisation means becoming evidence-based, like the rest of the business. "It needs to be about outputs, not inputs," she argues.
Being evidence-based means using evidence to make, validate and communicate decisions.
“It’s being able to show what you are doing, why you are doing it and what has happened as a result of what you are doing,” says Paul Moore, operational training manager at the financial services company LV. “As a training manager, that’s what the business is interested in me showing them.”
Evidence-based L&D focuses on outputs, such as performance improvements, accelerated time to competency and behaviour change as a result of learning. Evidence should permeate and inform every step of the decision-making process.
“We are moving into a more quantitative world in L&D and HR, driven by big data,” says Paine. “We are moving towards a tipping point where it’s unacceptable to just talk about how brilliant something is and get away with it. People want evidence.
“There’s a huge amount of inevitability in all of this. L&D has to wake up now or it will be forcibly woken up in the future.”
What Paine is talking about is the complacency of L&D – and HR as a whole – an historic attitude of accepting assertions at face value and then repeating those assertions as bona fide to others.
“People are willing to trot out things for which there is no evidence, which means that myths and fallacies endure in learning in a way that they wouldn’t in other professions,” he adds.
"I see people give presentations, slide after slide on a screen, talking about things like learning curves, but with no attribution anywhere. It's all 'you believe me, I believe someone else', and so it goes on – and that's when you get total rubbish being postulated."
Paine thinks it is unacceptable in these times of increased data and accountability for anyone to say “We know it works but we don’t know how it works”, yet he says L&D still does.
The business thinks it’s unacceptable too, which is why it is starting to ask L&D for hard evidence.
Laura Overton, managing director of benchmarking organisation Towards Maturity, agrees that L&D is lagging behind other functions in terms of offering evidence and insight.
She says the L&D community is ready for change but doesn’t know how to do it. “Many L&D leaders are looking to modernise their learning, but maybe they are held back by a fear of numbers,” she says.
Overton thinks L&D should look at how marketing uses evidence to drive performance and learn from its approach. “Everything that can be tracked, they track and then adapt their behaviour continually to improve performance.”
In its latest research, the 2014-15 Towards Maturity Benchmark Study: Modernising Learning, Delivering Results, Towards Maturity highlights the benefits enjoyed by organisations that are delivering truly evidence-based learning. These include higher productivity, higher revenue, and higher customer and staff satisfaction.
The report stresses the need for L&D to use all the evidence available to align itself with the business, to operate as business partners that use strategic business objectives to determine learning priorities.
That way L&D can deliver training that produces real results for the business.
“When we looked at the top performing 10% in our sample, we found that 100% of them are making learning decisions based on business decisions,” says Overton. “They are aligning learning to where the business is going at the moment.”
According to the Towards Maturity research, those highly aligned companies are:
- 13 times more likely to report increased revenue;
- 9 times more likely to report increased productivity;
- 5 times more likely to report improved customer satisfaction as a benefit;
- 50% more likely to have noticed positive changes in staff behaviour.
For L&D to align itself effectively with the business, it needs evidence about what the business needs are, what current skills and capabilities are, and how training can deliver the required results.
Benchmarking can help L&D achieve that because the process is all about evidence. It gives organisations real insight into how they are performing, what’s working and what isn’t and how this compares with other organisations.
“People can ask themselves: ‘What are the top performers doing? How do we compare against them? What are my gaps? If I take action in those gaps, what will happen?’” says Overton.
Overton says L&D needs meaningful evidence about what its customers are actually doing – how they are learning, where they are learning and what impact it has on their work – rather than assuming it already knows.
When Rachel Faulkner, L&D manager at Warwickshire County Council, decided to go through the benchmarking process with Towards Maturity, she wanted to see if the council’s L&D strategy was on track and how it matched up to other organisations.
"The results reassured us that we were going in the right direction as an organisation and highlighted some things that need to be done differently," she says.
The council conducted a benchmarking survey among employees and a learner landscape survey, looking at how employees were accessing learning and what their learning preferences were.
“One thing that came out strongly was that people want to collaborate,” says Faulkner. “We have a big geographical spread, lots of practitioners working out and about and the feedback was that collaboration is important to them.”
As a result of that feedback, the council is developing communities of practice where people can communicate online.
It is also developing internal coaching pools and looking at providing a collaborative space specifically for newly qualified social workers.
There has been a shift in the council’s evaluation strategy as well. It is changing and extending the evaluation process so that it looks for behaviour change, monitoring how people have put learning into practice. This will result in more evidence for the L&D team.
Another strand of the council’s new action plan is for L&D to be curators of information, signposting people to the learning and information out there.
The council is about to embark on a new benchmarking survey and learner landscape survey, and Faulkner looks forward to seeing whether any of the changes have taken effect.
The pressure is building for L&D to be more evidence-based. When the board starts asking for evidence, L&D can no longer ignore the inevitable. "You can't pitch up to a steering group or the board and say: 'This is what we think, it's a really magical idea,'" says Moore. "L&D needs to show evidence of why things will work."
If it does not, then L&D will be marginalised and sidelined from strategic decision making.