As we await the outcome of this year’s National Training Awards, the age-old issue
of thorough evaluation is thrown into the spotlight once again.
Elaine Essery
"Evaluation, evaluation, evaluation" could be Tony Blair’s mantra if he
were setting priorities for the training profession. Evaluation has typically
been a stumbling block for trainers. They often get the biggest buzz out of
designing and delivering a training programme, and consider the follow-up
activity a chore.
This age-old problem has come to light once more as the judging process for
the National Training Awards accelerates. From 22 October until 14 November,
regional winners will be announced across the country, before going forward to a
UK-wide ceremony on 3 December, where the special award winners will be
revealed.
Failure to evaluate training adequately is the most common reason for
would-be entrants to withdraw from the awards. Linda Ammon, chief executive of
UK Skills, which runs the NTAs, says: "People come along to our workshops
when they’re thinking of entering and we go through what they need to show.
They often find their evaluation has not been comprehensive, and some decide to
delay entering for a year until they’ve done a full evaluation.
"We’re looking for entrants to identify the outcomes of their training
and quantify them. What makes an NTA winner is exceptionally effective training
– it’s got to have made a big difference." Training professionals owe it
to themselves to prove their worth to their company in terms of added value.
Evaluating training outcomes is part of that process.
"If you don’t do it properly and can’t prove your worth as a trainer,
you probably don’t deserve to be in a job," says Jan Golding, training and
development manager – business development at the Hilton Birmingham Metropole.
Golding moved into training after 17 years in sales and business management,
and her strength is evaluating the impact of training on the bottom line. She
believes evaluation should always be the first consideration.
"When someone comes to you as a stakeholder on a piece of work they
want you to carry out, sort out the evaluation before you do anything else –
even before conducting a training needs analysis," she advises. "If
you ask yourself ‘how can we measure the success of this after the training has
been delivered?’ you will design and deliver a training programme that fulfils
the business need."
Identifying the measures will help to establish whether a provable training
need exists, reducing the risk of throwing training at a problem which does not
have a training solution. If asked to deliver negotiating skills training to a
sales team, for instance, asking why such training is needed may prompt the
answer that sales figures are down on last year. But it could actually be due
to a dip in the market, or some other external factor.
Jane Exon, learning and development director at NTA winner Debenhams,
agrees. "You’ve got to do your homework before you dive into some glossy
training," she says. "If you think evaluation is hard, question why
you’re doing the training in the first place, and be really clear.
"If you’re doing something on leadership because you think your leaders
are poor, go back to why you think they’re poor: is it because team members are
leaving, the team’s not very productive or people aren’t motivated? How do you
know? Because of labour turnover figures, productivity levels and climate
surveys. Well, there you are then – those are your measures for evaluation."
To demonstrate the added value of training, evaluation must be more than a
one-off activity. Regrettably, some organisations fail to move beyond the
post-course ‘happy sheet’, which is just the first step in what should be a
multi-stage process. You do need to know that the training has worked at
training-room level and that those undertaking it are satisfied. But you then
need to check that they have learned what was intended, and have been able to
apply the knowledge in the workplace.
Finally, you need to know whether it has made a difference at business
level. "If you’re constantly part of a commercial conversation about how
the business is performing and where the next area of need is, what you do will
have longer-term results," says Exon.
Corporate or commercial measures are necessary for gauging the business
benefits of training. But to fully understand how well your training has
succeeded, you need to evaluate individual – and possibly team – performance.
Golding discovered this when she entered a sales training programme for an NTA.
"I evaluated whether people had learned what I wanted them to learn and
whether they used it in the workplace by observing groups, but I didn’t
evaluate each team member one by one. It worked for the group, but I didn’t
know how well they were doing on an individual level. That’s what I’m doing
now."
Again, it comes down to having existing measures, Golding stresses. She
believes that hard and fast measures at an individual level before and after
the training are essential – not simply team targets or an overall productivity
figure.
The more established the commercial measures, the easier it is for trainers
to demonstrate their contribution to sustainable improvement, says Exon.
The real challenge in evaluation is whether the measures stand up over time,
so you can keep going back and seeing continued improvement in performance.
Revisiting the measures also indicates where further training and development
may be needed.
Debenhams won its NTA three years ago. Because the commercial measures are
still in place, the company can tell if the improvement has been sustained.
"The things which have been most successful are the areas where we’ve kept
going back. Sometimes we’ve gone back and seen the improvement has started to
slow, so you turn the training back on and it kicks off again," says Exon.
Adopting such an ongoing approach takes the chore out of evaluation, which
then simply becomes part of the normal way of working rather than an extra job.
It also allows for the fact that not all training will show instant results.
There may be a small initial improvement, but employees may need time and
further help to develop their skills, put their knowledge into practice and
gain confidence in the workplace before the full benefits of training begin to
show. A steady curve of improvement should be the result.
It is not just the intended outcomes of training that are relevant. The
really big prize when doing a thorough evaluation is discovering a number of
additional, unexpected outcomes.
Ammon explains: "You may be expecting certain benefits, but you often
get more. It could be increased staff motivation, reduced sick leave, greater
loyalty and attitudinal benefits, which all have a real effect on the bottom
line. That is a big message for trainers – they need to demonstrate the
attitudinal benefits just as much as the increase in skill levels."
Ammon is encouraged that the quality of evaluation has been improving since
the NTAs started 15 years ago. She puts it down to the dissemination of good
practice through NTA workshops and case studies, together with the focus placed
on evaluation by Investors in People (IIP). Many organisations have fallen down
on evaluation when seeking IIP recognition. Others recognised their
shortcomings and are now reaping the rewards of sharpening up their practices.
Gerry Farrelly, director of training at building services design and
installation group Farrelly Facilities, says: "When we went for the
national IIP standard we found out we weren’t monitoring training properly and
weren’t getting value for money. We
recognised that although we were spending money on training, we weren’t giving
feedback to employees to let them know where they stood."
As a result, the company introduced monthly personal development reviews for
all staff. These include evaluation of training undertaken and assessment of
the additional training and development needed to help individuals excel within
their current roles.
Training evaluation is closely tied in with project monitoring. If an
engineer has received internal or external training in project management, for
example, their skills in running the project are evaluated and any problem
areas identified which in turn may point to other training needs.
"IIP has led to a much more focused approach to training and
development," says Farrelly. "It has been the catalyst for our company."
Top tips on evaluation
– Be clear how you are going to evaluate before you design the training
– Set measures against which to assess the effectiveness of the training
– When evaluating, check for benefits additional to the desired outcomes
– Treat evaluation as an ongoing part of your job, not as a one-off chore
– If you can’t evaluate it, don’t do it