Measuring the effectiveness of training is one of learning and development’s holy grails. Can learning analytics lead you to it?
Learning analytics is something of a buzz phrase in learning and development (L&D). Although it can sound like a posh term for measuring how successful a learning programme has been, it promises a more scientific approach to this thorny topic – one training and L&D managers should be aware of.
Basic learning analytics are not new. At their most simple, they are ‘happy sheets’ – the feedback forms handed out at the end of training sessions.
Crude measure
While these can be useful in providing fast feedback to the session leader, and in identifying any major problems, they are a crude measure. This is not to say, however, that all experts are opposed to simple forms of analytics.
Many are applying technology-based enhancements to traditional assessment methods. For example, Alison Sharpe, learning specialist at learning consultancy Grass Roots, says: “In summer 2007, we developed a workbook-based distance learning programme on diversity for the 4,000 officers and civilian staff at British Transport Police. Before going onto further job-specific diversity training, each of them had to demonstrate their understanding of the workbook content.”
The consultancy devised a simple method for assessing this learning.
“Each member of staff took a PIN-based interactive test either by phone or by logging onto a website. They were asked 12 questions randomly drawn from a databank of more than 60. The results were then e-mailed over fortnightly. British Transport Police could then see precisely who had gone through, when and with what results,” Sharpe says.
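Grass Roots has not published how its system was built, but the mechanics Sharpe describes – drawing 12 questions at random from a databank of more than 60 and logging each result against a PIN – come down to a simple sampling-and-scoring routine. A minimal sketch in Python (the databank, PIN and answers here are all hypothetical):

```python
import random
from datetime import datetime

def draw_questions(databank, n=12):
    """Sample n distinct questions at random from the databank,
    as in the British Transport Police diversity test."""
    return random.sample(databank, n)

def score_attempt(pin, questions, answers):
    """Record who took the test, when, and with what result."""
    correct = sum(1 for q, a in zip(questions, answers) if q["answer"] == a)
    return {"pin": pin,
            "taken_at": datetime.now().isoformat(),
            "score": correct,
            "out_of": len(questions)}

# Hypothetical databank of 60 items, each with an answer key.
databank = [{"id": i, "text": f"Diversity question {i}", "answer": "a"}
            for i in range(1, 61)]

attempt = score_attempt(pin="1234",
                        questions=draw_questions(databank),
                        answers=["a"] * 12)
print(attempt)
```

The fortnightly e-mails British Transport Police received would then simply be an aggregation of records like these.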
In the US, L&D specialists are increasingly enthusiastic about the potential of technology-based learning analytics.
KnowledgeAdvisors, based in Chicago, is beginning to market its Metrics that Matter tool in the UK. It describes this as “a web-based human capital process measurement, learning evaluation and analytics system that allows organisations to cost-effectively measure the impact of the processes and improve performance”.
KnowledgeAdvisors runs two-day workshops – occasionally in London – where, for about £400, delegates can learn how to walk the learning analytics talk. This covers setting learning key performance indicators, building dashboards – numbers-based measurements – and applying best practice.
KnowledgeAdvisors customers tend to be tech firms and include Microsoft, British Telecom and Cisco.
Data collection
RM Training is also a customer. It provides ICT training to educational establishments in the UK. It wanted to develop an analytics system that would give it a better insight into the strengths and weaknesses of its different programmes, so it could make improvements where necessary. Following recommendations from other training providers, it looked into Metrics that Matter, and chose to implement it after being impressed by the automation of data collection and benchmarking, both internally and externally, that the system allows.
KnowledgeAdvisors claims Metrics that Matter costs on average 1% of the total programme budget. RM Training says, thanks to the tool, it was able to see straight away which programmes needed improvement. Having made those improvements, its customer satisfaction and overall course results went up. Furthermore, it recently received Gold Accreditation Standard from the Institute of IT Training, an achievement that RM believes would have been unlikely without such a thorough analytics system.
Not everyone, however, is so enamoured of the new technologies or the obsession with only measuring what can be counted.
Daryll Scott is joint managing director of personal development consultancy Use Your Noggin, and has provided training and coaching to HBOS, Microsoft, Symantec, Barclays, Honda, The Body Shop, and Baker Tilly, among others.
He says: “How do you evaluate an increase in confidence using an online tool? If someone improves their communication skills, where do you look for the measurable output? We are living in a world of left-brained, process-driven idiots who would be delighted if we all behaved like robots. It would be far more controllable and measurable, but leaves no room for creativity or personal genius.”
Indeed – but for many it is still hard to beat the Kirkpatrick model.
First published by Donald Kirkpatrick in 1959 in a series of articles in the US Training and Development Journal, it outlines four levels of achievement in training.
The first level is the reaction of the delegate. The second level is what the delegates learned. It looks at the principles, facts and techniques that were understood and absorbed by the participants. The third level is the change in behaviour that results from the training, while the fourth is about results and the impact on the organisation.
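Kirkpatrick prescribes no technology, but the four levels are straightforward to encode if you want to tag evaluation data against them – a hypothetical sketch:

```python
from enum import IntEnum

class KirkpatrickLevel(IntEnum):
    """Kirkpatrick's four levels of training evaluation (1959)."""
    REACTION = 1   # how delegates reacted to the session
    LEARNING = 2   # principles, facts and techniques absorbed
    BEHAVIOUR = 3  # change in behaviour resulting from the training
    RESULTS = 4    # impact on the organisation
```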
Enduring popularity
The model will be familiar to most who work in training, and has enjoyed enduring popularity.
Alan Thomas, staff resourcing and development manager for Henkel UK and Ireland, is an advocate. However, he believes it is far from perfect.
He says: “It is a useful tool that allows you to quantify and evaluate the effect of learning interventions. The problem, though, is that as you go up the levels, cost and effort increase exponentially, while reliability decreases exponentially.”
But he warns: “At some point, you have to step off the giddy cycle of ever more complex analyses and make the common-sense judgement that spending on development is beneficial in the main but some of it will be wasted. It’s just like advertising in that respect. We know that 50% of it is wasted, but we rarely know for certain which part that is.”
Indeed, as Thomas suggests, there is a fundamental difficulty with learning analytics. Regardless of the technology or the models you employ, you can never be entirely certain that it was your training that caused any improvements to the business. To know that, you would need a control case: a business insulated from broader economic developments, in which other departments, such as marketing and IT, did nothing at all.
In the absence of this certainty, those tasked with evaluating the effect of training can, at best, measure changes to delegate behaviour. This step is itself a challenge for many organisations.
Initial conversation
Charles Bethell-Fox, executive consultant at Personnel Decisions International, says: “You need to begin by asking how you will know if you’ve made a difference. If you ask this question, and think it through fully, then the deliverables should be obvious and the method for measuring them should be too. No technology can replace the importance of that initial conversation.”
Nigel Walpole, managing director at Bray Leino Learning, agrees, and adds that it is also essential to be specific about the desired improvements.
He says: “Most companies bring in trainers saying they want their staff to do something better. Where they go wrong is not saying how much better they want them to be. They need to work out what they want to improve, measure where they are now, and then measure it after the training. That, in a nutshell, is the key to learning analytics.”
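Walpole's "nutshell" is, in effect, a before-and-after comparison against a stated target. A minimal illustration (the metric and figures are invented):

```python
def evaluate_improvement(baseline, after_training, target_gain):
    """Compare post-training performance with the pre-training
    baseline and the specific improvement the company asked for."""
    gain = after_training - baseline
    return {"gain": gain, "target_gain": target_gain,
            "target_met": gain >= target_gain}

# Hypothetical: calls resolved per day, with a stated target of +3.
print(evaluate_improvement(baseline=12, after_training=16, target_gain=3))
# {'gain': 4, 'target_gain': 3, 'target_met': True}
```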
Case study: Gofastforward
Gofastforward is an Edinburgh-based training consultancy that provides management and interpersonal skills training to companies in Scotland and the North East of England. Clare Saunders, training manager, says she uses the Kirkpatrick model, but that her focus is on the consultancy’s proprietary Commit to It system.
She explains: “Whenever someone has a ‘lightbulb’ moment in our training, we get them to commit on paper to taking action back in the workplace based on that revelation. They share this with their manager, and then check back at intervals to ensure they are following through on their commitment.”
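Gofastforward describes Commit to It as a paper-based process, but the workflow Saunders outlines – record a commitment, share it with the manager, check back at intervals – maps naturally onto a simple tracking record. A hypothetical sketch (names and dates invented):

```python
from dataclasses import dataclass, field
from datetime import date, timedelta

@dataclass
class Commitment:
    """One 'lightbulb moment' commitment from the Commit to It process."""
    delegate: str
    manager: str
    action: str
    made_on: date
    check_ins: list = field(default_factory=list)

    def schedule_check_ins(self, interval_days=30, count=3):
        """Set the dates on which delegate and manager review progress."""
        self.check_ins = [self.made_on + timedelta(days=interval_days * i)
                          for i in range(1, count + 1)]

c = Commitment(delegate="A. Delegate", manager="B. Manager",
               action="Open each team meeting with a two-minute briefing",
               made_on=date(2008, 6, 2))
c.schedule_check_ins()
print(c.check_ins)  # three follow-up dates, a month apart
```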
She believes it is an effective way of embedding actual improvements and of measuring those improvements. She says: “Some software companies claim to be able to isolate the impact of training on a business, but I don’t think it’s possible. It’s far simpler and more effective to take 20 minutes out of a seven-hour day to get people to commit to action.”