The most accurate prediction in the field of computing was made in 1965 by Gordon Moore, then of Fairchild Semiconductor and later a co-founder of Intel. He made the audacious suggestion that the density of transistors on integrated circuits would keep doubling at a fixed interval, roughly every two years. Amazingly, device physicists and process engineers have managed to make good on Moore's prediction over the past 50 years.
This exponential rate of progress, the fact that everything about computing continues to double at fixed intervals, is what makes it so difficult to predict the future of computing technology. No other field behaves this way. If transportation technology had improved at the same rate as information technology over the past 30 years, a car would be the size of a mobile phone, cost €100, travel at 100,000 miles per hour, and cover 150,000 miles on a tank of fuel. A Boeing 747 would cost hundreds of dollars today rather than hundreds of millions. Ridiculous? Not in computing, where today's €500 laptop personal computer is vastly more capable than the $3 million building-sized institutional computer of just a few decades ago.
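To see how quickly fixed-interval doubling compounds, here is a minimal sketch. The figures (50 years, a two-year doubling period) are illustrative assumptions, not data from any particular process roadmap:

```python
def doubling_factor(years: float, doubling_period_years: float) -> float:
    """Total growth factor after `years` of doubling every `doubling_period_years`."""
    return 2 ** (years / doubling_period_years)

# 50 years of doubling every two years is 25 doublings:
print(f"{doubling_factor(50, 2):,.0f}x")  # -> 33,554,432x
```

Twenty-five doublings multiply capacity by more than 33 million, which is why straight-line intuition from any other industry fails so badly here.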
So what do we expect will happen over the next two to five years? I believe Big Data will continue to drive consumer behavior and technology evolution. The ability to manage, manipulate, and interpret massive quantities of raw data in real time, converting data into information and information into intelligence, is where our focus will need to be.