All the Computing Power in the World, in My Pocket

Take your smartphone out of your pocket and have a good look at it. It's a remarkable device.

My phone is not much bigger than an elongated playing card, just over half a centimetre thick, and weighs about as much as a pack of cigarettes. Inside, however, at the business end of things, it's hard to believe that such a small frame packs a 1.5 GHz dual-core processor and 1 GB of RAM. In terms of computing power, that makes my new phone about two thirds as powerful as my home desktop PC.

Thinking back to when I bought my desktop in 2006, the year before the iPhone was launched, I imagine I'd have been quite surprised to learn that I'd be carrying an equivalent device in my pocket wherever I go just six short years later.

Having grown up on 1980s computing, I was particularly pleased to discover an app recently that allows me to play the ZX Spectrum games that I enjoyed in my formative years. After an hour or so spent working out that I'm still terrible at Manic Miner, and playing a few other classics, the irony of the situation really dawned on me.

I've no real idea how to work out just how many times more powerful than an actual ZX Spectrum my phone really is, but comparing the memory (RAM) alone, 1 GB gives my phone over twenty thousand times the capacity of a 48 KB Spectrum. Trying to imagine twenty thousand Spectrums linked together, condensed into the palm of my hand, gave me an odd sensation of futuristic vertigo.
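
For the curious, the back-of-the-envelope sum is simple enough. A minimal sketch in Python, assuming a 48 KB Spectrum and taking the 1 GB at face value:

    # Rough RAM comparison: a 48 KB ZX Spectrum vs a 1 GB phone
    spectrum_ram = 48 * 1024         # 48 KB, in bytes
    phone_ram = 1 * 1024 ** 3        # 1 GB, in bytes
    print(phone_ram / spectrum_ram)  # ~21845, so "twenty thousand" holds up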

It then occurred to me that at some point in recent history, probably within the last 50 years or so, there must have been a time when the computing power of my current phone was equal to the computing power then available in the entire world.

Although there seem to be a few competitors for the title of the first, what we would call "modern" computing began during the Second World War with machines we would barely recognise as such today. In the 1960s Seymour Cray built the world's first supercomputers, which reached clock speeds of up to 36 MHz by the end of the decade (a simple division of the clock speeds puts that at roughly one forty-second of my phone's, despite the several-million-dollar price tag). Not being overly technically minded or up with the jargon, I find it quite hard to work out exactly when the first single computer with power equivalent to my phone's was built, but around the late 1980s or early '90s seems like a reasonable estimate.
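
For anyone who wants to check that division, here it is in Python, with the caveat that clock speed alone is a very crude proxy for real computing power:

    # Crude clock-speed comparison: late-'60s supercomputer vs a 1.5 GHz phone
    cray_clock_mhz = 36       # the 36 MHz figure cited above
    phone_clock_mhz = 1500    # 1.5 GHz, per core
    print(phone_clock_mhz / cray_clock_mhz)  # ~41.7, i.e. roughly forty-two times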

This rate of growth in computing was famously summarised in the mid-'60s by Intel co-founder Gordon E. Moore, who described a trend in computing hardware where the number of transistors that can be inexpensively placed on a computer chip doubles at a regular interval (he initially said every year, later revising it to every two years). Moore's Law of exponential growth is mirrored in a number of other factors in computing, and so it is now also (somewhat incorrectly) used as a rule of thumb to describe the ever-increasing speed and ever-decreasing cost of computers in general.

Thinking about this rule and looking at the computing power available today, I wondered whether I could reasonably expect one day to hold a device in my hand with the power of a modern supercomputer, or perhaps even the equivalent of all the computing power in today's world. While the numbers seem astronomical now, that's the thing about exponential growth - it tends to happen quite quickly.

The answer to this question doesn't seem to be all that clear. Moore's Law is ultimately restricted by a limitation of scale - eventually you reach a minimum possible size for transistors, probably at the atomic level, and at that point you simply can't fit any more into the same area. After that, the only way to increase speed is to increase the size or number of the chips, which drives up the cost of producing more powerful machines. Exactly when this will happen is also an area of some debate, with estimates ranging anywhere from the next few years to at least the 2030s.

Some futurists, such as Ray Kurzweil, don't see this physical limitation of Moore's Law as a barrier, believing that other forms of computing will arise, such as quantum or DNA-based computers, that will allow us to keep pushing the boundaries of computing power in other ways. Although still in its infancy, quantum computing currently seems the most likely to provide the next big leap, with research departments worldwide competing furiously to be the first to solve the next problem.

Having grown up through the recent computing revolution, I find it hard to believe that the atomic limitations of Moore's Law will hold us back for long, if at all. At times it feels as though there's a certain inevitability to our accelerating progress, and while there may be obvious benefits, we've never been too good at dealing with the accompanying downsides - smartphones are indeed wonderful things, but they tend to cost the Earth in more ways than one.

In either instance, assuming Moore's Law holds true for the next few years, by 2030 it seems I could reasonably expect to hold a computer in my hand at least five hundred times as powerful as the one I have today. By comparison, our current computers and software will no doubt look as quaint to our future selves as the ZX Spectrum does to us now. What changes these future machines may bring is another matter but, being a child of the 1980s, I'll probably still be trying to find a way to play Manic Miner on them.
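
For what it's worth, the five-hundred-times figure falls straight out of the doubling rule. A quick sketch, assuming this is 2012 and one doubling every two years:

    # Moore's Law projection from 2012 to 2030
    doublings = (2030 - 2012) / 2   # one doubling every two years
    print(2 ** doublings)           # 512 - "at least five hundred times"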
