Industry 4.0 and the productivity problem
Despite rapid developments in technology, productivity isn’t showing anything like the same level of improvement.
The fourth industrial revolution is supposedly upon us: the internet is now part of almost every business, and machine learning is moving into the mainstream. Business productivity should surely be surging ahead, yet economies around the world are stuck in a rut.
Since a burst of rising productivity in the 1990s, growth in the metric has slumped, a downturn that began well before the crash of 2008. The numbers are not getting better: the latest figures from the UK's Office for National Statistics show that productivity, measured as output per hour worked, fell by 0.5 per cent between the fourth quarter of 2016 and the first quarter of 2017.
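The headline measure behind that figure is simple arithmetic: output divided by hours worked, compared quarter on quarter. A minimal sketch of the calculation, using hypothetical figures rather than real ONS data:

```python
# Illustrative sketch (hypothetical numbers, not ONS data): labour
# productivity is output per hour worked; the headline change is the
# percentage difference between successive quarters.

def output_per_hour(gross_value_added: float, hours_worked: float) -> float:
    """Labour productivity: output produced per hour worked."""
    return gross_value_added / hours_worked

def quarterly_change(previous: float, current: float) -> float:
    """Percentage change in productivity between two quarters."""
    return (current - previous) / previous * 100

# Hypothetical economy: output flat, hours worked up slightly, so
# productivity falls even though total output has not.
q4_2016 = output_per_hour(gross_value_added=500.0, hours_worked=1000.0)
q1_2017 = output_per_hour(gross_value_added=500.0, hours_worked=1005.0)

print(round(quarterly_change(q4_2016, q1_2017), 2))  # prints -0.5
```

The point of the toy numbers is that a 0.5 per cent fall can come entirely from hours rising faster than output, which is why economists worry about what each side of the ratio actually captures.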
In a speech at the London School of Economics in the spring, Bank of England chief economist Andy Haldane credited productivity growth with most of the rise in living standards enjoyed by UK citizens since the first industrial revolution more than two centuries ago. The ride has been bumpier since the Second World War, but the country had been catching up with its western peers until the collapse of the Northern Rock bank signalled the end of the mid-2000s boom.
Since then, as Haldane points out, the Bank of England has been forced to cut its forecasts of productivity growth year after year. And though the UK's productivity performance continues to lag behind that of France, Germany and the US, among others, the rest of the developed world has now joined it in the slowdown.
“This tale of productivity disappointment, in forecasting and in performance, has been extensively debated and analysed over recent years. Some have called it the ‘productivity puzzle’,” Haldane says. “With each year that passes, and as each new turning point in productivity has failed to materialise, this mystery has deepened. This has led some to conjecture that the world may have entered a new epoch of sub-par productivity growth, an era of secular [long-term] stagnation.”
Some economists have offered explanations for why the UK trails France and other EU states. Higher unemployment and fewer hours at work in those countries may mean only the more productive firms and employees are in work, flattering a measure partly based on output per hour worked. But that does not explain the wider picture. With all the technology being deployed in the workplace, why are people not getting more done?
There are three possible culprits. The first is mismeasurement: the formulas and proxy variables used to calculate what economists call total factor productivity fail to capture outputs properly. The second is misallocation of money: a decade of cheap lending, intended to stop the world economy from descending into a slump, may simply be keeping unproductive organisations on life support. The third is a failure among companies to organise themselves to be more productive, either because they have found other ways to increase profits or because they simply do not face enough competition to force them to become more efficient. Because the productivity puzzle emerged in the wake of the collapse of the internet bubble, suspicion has fallen on information technology (IT).
In September 2016, US think tank the Brookings Institution brought a number of economists together to try to work out what is going on with productivity and to answer the question posed in a 2013 paper for the US Federal Reserve by David Byrne, Stephen Oliner and Daniel Sichel: 'Is the information technology revolution over?'
In their paper, Byrne and colleagues argued that the contribution to productivity growth from IT has been falling since the 1990s. It was 1.5 per cent 20 years ago but dropped to 0.5 per cent ahead of the 2008 financial crash. At the Brookings conference, Goldman Sachs chief economist Jan Hatzius argued that the drop may be more easily explained by a shift in what society consumes: “I think there’s a very straightforward story for what could have driven, and I think has driven, a significant part of the slowdown, namely a shift in the technology sector away from goods and items that are measurable, where quality improvement is measurable and where there are quantitative metrics, such as processor speed, memory or storage capacity, to sectors where it’s much more difficult to come up with good quantitative metrics of how quickly quality is improving.”
Hatzius argues computer hardware has become more specialised and software less generic: “The official indices basically say that if you spend $100 on software now, you’re still getting essentially the same amount of real value as you did 10 or 20 years ago, which to me seems quite implausible.”
We also work with much more stuff that is free or almost free: most web servers run open-source software. Do all those nominally free services from Facebook, Google and others make up the $3tn in supposed 'lost productivity' in the US?
Organisations such as the US Bureau of Labor Statistics claim their 'matched model' methods do capture at least some of that value, though they expect measurement to become harder still as factory-less production through technologies such as home 3D printing becomes more common.
The problem, according to Professor Chad Syverson of the University of Chicago’s Booth School of Business, is that finding those lost trillions in IT alone would make the industry six times larger than economists believe it to be. “Do you think we measure things so imperfectly that we hit one-sixth of the actual activity?” Syverson asks. “I think you can definitely reject [the idea] that all of the slowdown [in productivity] is from measurement.”
Another argument against mismeasurement as the explanation was deployed in a 2012 paper by Professor Bob Gordon of Northwestern University. When surveyed, few consumers want to give up the phones and tablets they now own, because of the personal value they bring. But the technology behind them has had less of a material impact on society than the major advances made possible by earlier industrial revolutions: electricity, steel construction, mechanical transportation and chemical production.
Although technology seems to be diffusing more rapidly than ever, the more recent inventions have had less visible effect. The heyday of global productivity growth lay in the middle of the 20th century, when it averaged 1.9 per cent a year; since 1980 it has averaged just 0.3 per cent.
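The gap between those two averages compounds dramatically over time. A back-of-the-envelope sketch, applying the two growth rates quoted above over an illustrative 35-year span:

```python
# Back-of-the-envelope sketch: compound the two average annual growth
# rates in the text over 35 years (an illustrative span, roughly the
# post-1980 period) to see how much the difference matters.

def compound_growth(rate_pct: float, years: int) -> float:
    """Cumulative gain (%) from a constant annual growth rate."""
    return ((1 + rate_pct / 100) ** years - 1) * 100

mid_century = compound_growth(1.9, 35)  # mid-20th-century pace
post_1980 = compound_growth(0.3, 35)    # post-1980 pace

print(round(mid_century))  # prints 93: output per hour nearly doubles
print(round(post_1980))    # prints 11: barely a tenth higher
```

At the mid-century rate, living standards roughly double within a working lifetime; at the post-1980 rate, they rise by about a tenth, which is why the slowdown is treated as a first-order economic problem.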
On the other hand, the issue may be that the future is here; it's just not evenly distributed. A study of Sweden, whose findings may apply more widely, found there can be a long lag between investment in IT and rises in productivity. At first, Ericsson researcher Harald Edquist and Professor Magnus Henrekson of the Research Institute of Industrial Economics in Stockholm found that R&D had almost immediate payoffs, while IT and communications investment barely made any difference at all. When they took longer time periods into account, they found IT and communications spending took seven or eight years to show an effect. The mid-1990s productivity boomlet may therefore have been a delayed effect of the rise of the PC, rather than of rapid adoption of the internet.
Then there is the spread of productivity within industries. A 2015 study by Dan Andrews and colleagues from the OECD drew a distinction between firms “at the productivity frontier” and other companies. The global frontier firms plough money into capital and patent-generating R&D, have larger sales and tend to be more profitable.
“In arithmetic terms, it is non-frontier companies that largely explain flat-lining productivity over recent years,” Haldane says, using evidence from a UK-focused survey. “For a large cohort of non-frontier companies secular stagnation is evident, with low and flat-lining levels of productivity. Around one-third of UK companies have seen no rise in productivity throughout this century. This is a long tail.”
Andrews says the biggest split between frontier firms and laggards lies in the services sector. That may be partly because services companies are more protected from international competition. Another element could be a ‘winner takes all’ dynamic in which the companies with the largest market share take the bulk of the profit and so have more to reinvest in technology, while their competitors are left behind.
Some sectors may simply be highly resistant to productivity improvement, with firms finding other ways to cut costs and increase profitability. The UK's gig economy of zero-hours contracts may be a symptom of this effect, and the trend is amplified by manufacturing that is automating, but automating overseas. Professor John Haltiwanger of the University of Maryland points to the situation in the US and the shift towards retail, leisure, hospitality and personal services: traditionally low-productivity sectors, but ones that can be profitable.
The combination of slow IT payoffs and the inability of the wider mass of firms to keep up with the organisations using technology to capture market share may mean productivity continues to edge upwards only very slowly. Even large-scale robotisation may take decades to show up in the aggregate numbers. Perhaps, as in the previous two centuries, productivity is set to peak around the middle of the century once again.