If it moves, measure it

When it comes to cost-effectiveness, you don’t need a calculator to show the advantages that online training has over a traditional classroom approach.

Instead of sending your staff somewhere else to learn, you can simply set them up in front of a PC with an internet connection. No travel or accommodation costs, expenses or instructor required.

Initially, this was enough to secure some quick wins for suppliers and brownie points for training managers, but it wasn’t long before questions were being asked. “It was cost effective, but was it learning effective?” asks Susan Honor, e-learning consultant at Ashridge Business School.

As it turned out, there proved to be a world of difference between the two.

E-learning doubts

Several years on, huge doubts remain over e-learning’s true effectiveness. Research carried out at the end of last year by Training & Coaching Today indicates that opinion, while mixed, leans towards the negative. Only 11% of the 135 training professionals polled thought it was ‘very effective’, although a more encouraging 58% believed it was ‘fairly effective’. The Chartered Institute of Personnel and Development’s (CIPD) findings were more emphatic still: just 1% of 635 CIPD members polled in its 2006 learning and development survey said e-learning was the most effective method of learning.

Such findings will hopefully force more training managers to re-think not only how they use e-learning, but also how they judge its effectiveness.

There is no shortage of metrics available for measuring online learning, with learning management systems (LMS) dutifully recording enrolments, completions, progress, scores, certification, drop-out rates and more. While useful, whether such data is the best way to gauge learning effectiveness has always been debatable. More significantly, it is likely to have even less relevance if training managers learn to play to e-learning’s true strengths.

Online learning has proven most effective at providing just-in-time training at the point and place an employee needs it, and as a key 24/7 support tool for learning in general. This may involve accessing just one part of a course, or downloading an online reference guide to solve a particular problem.

“Course completion used to be one of the main ways e-learning was measured, but it’s not about sitting through a course from A-Z any more,” says Kevin Young, managing director of e-learning provider SkillSoft. “It’s about immediacy, tapping into the learning when you need it and using it to support informal learning.”

A measured approach

SkillSoft’s recent Benchmark Report on learning and development revealed that most respondents viewed measurement as necessary, but only if it measured the right things, rather than being an exercise in internal referencing. There was no consensus on how to go about measuring effectiveness.

Although most said they used ‘happy sheets’, pre- and post-assessments and other general assessment techniques, it was felt these served mainly to ensure best value from suppliers. They did not show how learning was being applied at work.

Young believes the key to measuring e-learning and learning of all kinds is to put it in a wider business context. “The learning and development community has got to align learning with the organisation’s strategic business goals,” he says.

Alan Samuel, head of UK operations for Tata Interactive Systems, agrees, and says that everyone in the e-learning sector must make a similar mindset change.

“E-learning specifiers and buyers need to move from learning objectives to business objectives,” he says. “The classic measurements of e-learning still exist – reducing the cost of training, reducing employees’ time away from their jobs, and so on – but e-learning is not about reducing costs, it’s about meeting defined business needs.

“We’ve designed sales training for Vodafone delivered via e-learning, and have been aware from the outset that the measure of these materials’ success is the percentage increase in Vodafone’s sales,” he says. “That is an easily recognisable measure of success, and one to which we are happy to work.”

Training Synergy, one of the UK’s largest suppliers of freelance IT trainers, measures e-learning’s effectiveness in its own business, gauging the impact of any training or learning activity against its own key performance indicators, and it expects clients to do the same, says its director of training, David Field.

“Where we have trained people in banking systems, the measures of the training’s success have included the reduction in waiting times and branch sales performance ratios. For example, how many more foreign exchange deals the branch is now putting through,” he says.

Evaluating effectiveness

“Of course, this raises issues of how to carry out these measurements effectively and efficiently, and that’s where floor walking – the human element – plays an important part in evaluating the effectiveness of the e-learning. It also helps to create a sustainable learning set.”

The difficulty with relying on business results as a method of assessing training is that it can take several months to acquire any meaningful data. Jane Knight, head of research at Learning Light, a not-for-profit organisation that advises on e-learning, recommends putting interim measures in place. These should assess user satisfaction, establishing whether individuals feel they have learned anything and, crucially, whether they perform better after the training.

“If you’re hitting the mark on all of these things, the business benefits should follow,” she says, adding that training managers must define what and how they are measuring from the outset.

“The first question to ask before implementing any e-learning solution is: ‘Why are we doing this?’ Above all, remember that it’s about using e-learning as a means to an end, rather than an end in itself.”

The move towards measuring effectiveness against business goals and personal development doesn’t mean that we should ditch all the traditional methods of measurement, however. ‘Happy sheets’, staff surveys, and pre- and post-assessments all still have a part to play.

Sophisticated business intelligence software – either built into an LMS or available as a bolt-on – provides further options to drill down and unearth valuable insight and information.

Get analytical

Tools to look out for include the web-based learner-evaluation system Metrics that Matter, developed by US-based Knowledge Advisors. This allows you to measure the effectiveness of your investment in learning through a customised survey.

“The tool benchmarks training programmes, measures and improves them, and provides accountability for the pounds spent on learning,” says Colin Terry, managing director of New Wave Learning, which recommends the product to its clients.

Users of the tool include PeopleSoft, which relied on it to measure the impact of learning at its PeopleSoft University, and Microsoft, which employed it to measure the training performance of those undertaking courses from its certified vendors. For the latter, the software reported the results of hundreds of training events around the world instantaneously.

The debate over effectiveness will no doubt continue with each new survey that appears. But as online learning starts to become just one of many delivery tools for learning and development, we can hope that the arguments start to fade with it.

We already know that, when used to address specific business needs and as a support tool for formal or informal learning, online learning can deliver value for both the organisation and the individual. It is down to training managers to know when e-learning is a good fit and when it is not; get that right, and effectiveness should follow.

Case study: Aligning to business objectives


The challenge

To support the instructor-led training of around 2,000 ‘technical colleagues’ at Halifax Bank of Scotland (HBOS), so that they could facilitate the organisation-wide migration from Windows NT4 to Windows XP.

The solution

HBOS gave technicians involved in the migration access to the ITPro online library from SkillSoft’s Books24x7 to support and supplement its instructor-led training programme. Typically, technicians used Books24x7 to help find answers to job-related questions (86%), learn a new topic or skill (85%), and deal with changing information and knowledge needs (84%).


The results

Rather than relying on ‘happy sheets’ and course completion statistics, HBOS judged the effectiveness of the online support against business objectives. Time was limited, as Microsoft’s support for NT4 was due to end soon.

It surveyed technical staff at an early stage and found that each was able to save an average of one hour per week by being able to find the information they needed immediately.

This equated to 6,150 hours each year which, taking a ‘conservative’ hourly rate of £20, gave a productivity saving of more than £129,000 per annum.
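The figure above is a simple rate-times-hours calculation, sketched below in Python. The function name and structure are illustrative, not from the case study; the inputs are the article’s quoted survey figures.

```python
def annual_productivity_saving(hours_saved_per_year: float,
                               hourly_rate_gbp: float) -> float:
    """Annual productivity saving, in pounds, from staff time saved."""
    return hours_saved_per_year * hourly_rate_gbp

# The HBOS case study's figures: 6,150 hours saved per year,
# costed at a 'conservative' £20 per hour.
saving = annual_productivity_saving(6150, 20)
print(f"£{saving:,.0f} per annum")  # prints £123,000 per annum
```

With the rounded inputs quoted, this comes to £123,000; the article’s published total of more than £129,000 implies the underlying survey data held slightly higher hours or rates than the rounded figures given.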

HBOS says the online support significantly reduced project up-skilling times, enabling it to complete the migration on time. In addition, 95% of staff said Books24x7’s content was relevant to their role, and 93% said it would expand their skills and support their personal development.

“The feedback we received subsequently helped us to prove return on investment because of the amount of time people told us they were saving,” says Debbie Rawlinson, training manager of the project. 

Top tips

  • Align effectiveness measurement to business objectives.

  • Define your measurement criteria and approach from the outset.

  • Understand why you are embarking on the e-learning project – unless you do, you won’t know what to measure.
