Moore’s law: All things good come to an end

Peter Crush

Wednesday 23 August 2017

In recent decades, workers have relied on a law that most have never heard of. It’s a law that enables them to work remotely on devices with increasing speed and processing power.

Moore’s law – a hypothesis made more than 50 years ago by American engineer Gordon Moore (who would later co-found Intel) – is beautifully simple. He predicted that, thanks to better technology and manufacturing, the cost of computer chips would halve every two years, and their processing capability would double.
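The doubling-and-halving claim is simple compound growth. As a rough sketch (the baseline figures below are arbitrary, chosen only for illustration), capability and cost after a given number of years can be projected like this:

```python
def project(base_value: float, years: int, factor: float = 2.0, period: int = 2) -> float:
    """Project a quantity that changes by `factor` every `period` years."""
    return base_value * factor ** (years / period)

# Hypothetical chip: relative performance 1.0, cost 100 (arbitrary units).
for years in (0, 2, 4, 10):
    perf = project(1.0, years)                 # doubles every two years
    cost = project(100.0, years, factor=0.5)   # halves every two years
    print(f"after {years:2d} years: performance x{perf:.0f}, cost {cost:.2f}")
```

After a decade of this, performance is up 32-fold and cost is down to about 3 per cent of where it started, which is why even a modest slowdown in the doubling period compounds into a large gap over time.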

The end of an era

The prediction has so far stood the test of time. However, academics now argue that Moore’s law could grind to a halt as early as 2019. Intel has already pushed back its latest chip (featuring elements as small as 10 nanometres) to later this year, and other technology partners are reporting much slower development times.

While all of this may not sound like a big deal, it’s a problem for a number of reasons. According to McKinsey, up to 40 per cent of global productivity growth since the 2000s has come from technology made possible by chip performance and price. And there’s another, perhaps more fundamental, concern: the breakdown in Moore’s law comes at a time when academics are predicting the next big development in work (‘Industry 4.0’), which involves big data and the greater use of artificial intelligence. Both of these activities will require faster, smaller, cheaper processors, not slower and more expensive ones.

So are we really reaching a point where we will only be as productive as our PCs allow us to be?

It’s all about productivity

Some might say the tipping point was reached years ago, because workforce productivity has fallen even with Moore’s law in full swing. That said, there remains some cause for concern. While we might not need our physical mobile devices to run any faster (because they’re now connected to the cloud), it’s this cloud environment that must be faster and more efficient so that workers can retrieve and interpret the results of those computations.

Old bugbears still need fixing

But perhaps there is still hope. There’s an argument that what matters most in mobile computing are things that sound quite basic, but haven’t yet reached a point of completion – things like battery life, how good PCs are at sending and receiving Wi-Fi signals, or the ability of GPS to work wherever the device happens to be.

These are basic product improvements that need to run in tandem with more serious processing-power technology, so the PC manufacturers that can deliver them will be the big winners.

Managers ultimately want to give their staff equipment that doesn’t crash, that lets other devices plug in easily, that is more reliable and that lasts longer. While computer chip processing power will still need to improve (at a slower rate), getting the basics right will arguably unlock the greatest amount of productivity – at least in the short term.
