Don’t be mad at machines

During the general election campaign, unemployment – the most wasteful HR policy in the history of humankind – barely got a look-in. “The lowest unemployment for 29 years,” crowed Gordon Brown, as he surveyed the industrialised world, and pronounced himself satisfied. France’s unemployment sits at 9.7%, Germany’s at 9.5%, Poland’s at 18.8%. Even the US and Australia have 5.5% of their respective workforces unemployed, so it is not hard to see why unemployment in the UK – at 4.6% – has, for the moment, ceased to be ‘political’.

The situation is far from rosy: there remain pockets of severe regional unemployment in the UK; the private sector has more or less stopped creating jobs since summer 2003; the Labour government seems to be re-nationalising the workforce through the expansion of the public services; and the number of people on incapacity and related benefits – at 2.5 million – is too high.

Yet looking back on the dole queues of 20 years ago, it seems crazy to quibble: in comparison with other countries, Britain’s jobs market is an astonishing economic success story. The experience of becoming unemployed has been compared in some studies to bereavement or divorce; the less of it, the better.

But I would like to think that today’s full(ish) employment also contains another optimistic, if easily forgotten, lesson for us, and it is this: machines – what on earth were we so afraid of?

The years between 1975 and 1995 were remarkably rich for the technological gloom industry. Technology – an amalgam of the Greek words techne, meaning skill, and logia, denoting an organised body of knowledge – has always been feared as a destroyer of jobs. The acceleration of scientific knowledge during these years led some highly regarded experts to believe the goal of a technological society – ridding the world of the worker, though perhaps not to the level of the baddies in The Terminator – was finally achievable. Between automation, robotics, nanotechnology and microprocessors, the elimination of the work of millions seemed imminent.

“It is impossible to over-dramatise the forthcoming crisis as it potentially strikes a blow at the very core of industrialised societies – the work ethic. We have based our social structures on this ethic and now it would appear that it is to become redundant along with millions of people.” So declared Clive Jenkins (the former union leader) and Barrie Sherman in The Collapse of Work, published in 1979.

Because machines could perform ever more complex mental functions, any worker who followed instructions, whether in the manufacturing sector or in an office, could theoretically be replaced: painters, cashiers, mechanics, secretaries, packers – the list was long. “This means that the role of humans as the most important factor of production is bound to diminish,” wrote the Nobel laureate for economics Wassily Leontief in 1983.

In theory, they had a point. Only a mad employer would choose to employ people if the work could be done by machine. Technology takes no coffee breaks, is happy to work three shifts, does not call in sick on Mondays, does not become bored or qualify for a pension, and never asks for better wages.

There can be no doubt that technology displaces labour (as the Luddites knew well). The vital question is whether it also creates jobs at a faster rate than it destroys them – whether technology is capable of “creative destruction”, in the economist Joseph Schumpeter’s phrase.

If the introduction of new technology leads to new products and processes, which in turn stimulate markets and create jobs, then the short-term pain of evicting workers from inefficient, unproductive work may well be what economists like to say it is – a price worth paying. From the prosperous vantage point of today, this appears a reasonable argument. But 20 years ago, few authors could see anything creative in the destruction.

That the market for labour was ‘ending’ was a widely held view. “The time may come when society may have to find alternative means for sharing work and distributing the wealth that it generates,” warned a 1984 article in the journal Technology in Society.

This bleak message was taken up again a decade later in polemical fashion by Washington sage Jeremy Rifkin. Employment is a measure of self-worth, he noted, seeing a “clear correlation” between technological unemployment and “psychological morbidity” and depression. His 1995 book, The End of Work, warned: “If the talent, energy and resourcefulness of millions of men and women are not redirected to constructive ends, civilisation will probably continue to disintegrate into a state of increasing destitution and lawlessness from which there will be no easy return.”

I would like to think that the British employment record sheds a certain light on such fears. Arguably, the role of technology can be overdone, as the boom occupations of the 1990s – housework, hairdressing, shelf-stacking, car-washing – require no higher skills than the older jobs they replaced. Yet the crucial point is that technology has not rendered the human factor obsolete, only altered its application in diverse and surprising ways.

The widespread terror of the machine has been found to have been misplaced – paranoid, even. When employment begins to fall again, all the old anxieties will doubtless return. But for now, we can love the machine with all the ardour of a wage-earner.