Remarkably, Moore's Law has held for 50 years, and in each decade it has made a new class of device possible: mainframes in the 1970s, personal computers in the 1980s, laptops in the 1990s and smartphones in the 2000s. Computers have become far smaller and yet far more powerful. Supercomputers of the 1980s occupied entire floors of large buildings, but any modern smartphone crunches numbers faster than those behemoths.

Because it has held up for so long, Moore's Law has come to be taken for granted. It is not, however, an immutable scientific principle but an empirical observation about the industry. Chips have already hit one wall: clock speeds cannot be boosted further without overheating. Other limits, imposed by quantum mechanics, are likely to come into play soon. A modern chip contains billions of transistors, each smaller than a typical virus, and some 400 billion-billion (4×10^20) transistors were manufactured in 2015. The distance between two transistors on a chip is now about 14 nanometres (a nanometre is a billionth of a metre). At around five nanometres, quantum uncertainty kicks in: engineers will no longer be certain where electrons are or how quickly currents are flowing, and effects such as "tunnelling" and "entanglement", in which particles cross barriers they seemingly should not and influence one another instantly, will arise.
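To put those figures in perspective, here is a back-of-the-envelope sketch in Python. It assumes the classic formulation of the law, a doubling of transistor counts roughly every two years, and simply restates the numbers quoted above; the cadence and totals are illustrative rather than authoritative.

```python
# Rough arithmetic behind the claims above (illustrative assumptions only).

YEARS_LAW_HAS_HELD = 50
YEARS_PER_DOUBLING = 2          # classic "doubling every two years" formulation

doublings = YEARS_LAW_HAS_HELD / YEARS_PER_DOUBLING
growth_factor = 2 ** doublings  # implied transistor-count growth over 50 years
print(f"~{growth_factor:,.0f}x more transistors per chip")     # ~33,554,432x

# "400 billion-billion" transistors manufactured in 2015, in scientific notation.
transistors_2015 = 400 * 10**9 * 10**9
print(f"about {transistors_2015:.0e} transistors made in 2015")  # about 4e+20

# Feature scale: from today's 14 nanometres towards the ~5 nm quantum threshold.
nm = 1e-9
print(f"14 nm = {14 * nm} m; quantum effects loom near {5 * nm} m")
```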
Chip makers must therefore seek new design solutions. Software programmers, for their part, can no longer rely on steady increases in raw processing power; they will have to write code that uses existing processing resources more efficiently. Some hardware manufacturers will gravitate towards specialised chips that address specific needs, which implies a shift to integrated design, with programmers and hardware engineers working closely together. Graphics chips, for example, are already specialised; fabricating chips for other specific workloads, such as virtual reality or particular kinds of mathematical calculation, is one way forward. Another approach is to look for new materials and methods that let chips run cooler or faster: there is research into graphene, a form of carbon, and into alloys such as silicon-germanium as alternatives to plain silicon. Experiments in optical computing, which uses light rather than electricity to carry information, are also attracting large R&D budgets. A further line of research embraces quantum effects rather than trying to avoid them. Quantum computing could have many advantages in theory, but researchers are still a long way from solving the counter-intuitive problems that arise in manipulating quantum bits (qubits) of information.
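To make the first point concrete, that programmers will have to get more out of the hardware they already have, here is a minimal Python sketch comparing a plain element-by-element loop with a vectorised NumPy equivalent. It is a generic illustration of "using processing resources more efficiently", not a method attributed to any chip maker mentioned above, and the array size is arbitrary.

```python
import time
import numpy as np

# Illustrative only: the same sum computed two ways on the same machine.
# Vectorised code lets the runtime exploit the CPU's wide arithmetic units,
# one everyday meaning of using processing resources more efficiently.

rng = np.random.default_rng(0)
values = rng.random(5_000_000)

start = time.perf_counter()
total_loop = 0.0
for v in values:                 # one Python-level operation per element
    total_loop += v
loop_time = time.perf_counter() - start

start = time.perf_counter()
total_vec = values.sum()         # one call, executed in optimised native code
vec_time = time.perf_counter() - start

print(f"loop: {loop_time:.2f}s, vectorised: {vec_time:.4f}s")
print(f"results agree: {np.isclose(total_loop, total_vec)}")
```

On typical hardware the vectorised call is orders of magnitude faster, even though both versions run on exactly the same silicon; the gain comes purely from how the code is written.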
From the point of view of serving end-users, there are two big new markets. One is the "Internet of things", the idea that chips and sensors can be attached to more or less every kind of appliance, from washing machines and air conditioners to traffic lights. The other is cloud computing: rather than carrying enormous computing power in their pockets, consumers want fast access to the cloud, which has boosted demand from the data centres that host it and from broadband infrastructure providers. Whatever else they do, chip makers must meet the demands of such end-users to stay in business. The end of Moore's Law will mark the end of one era of computing, but it could unleash a new wave of creativity in which an entirely different set of players comes to the fore.