Most people involved in the world of computing are aware of Moore’s Law, posited by the eponymous Gordon Moore in the 1960s, which states that computing power doubles roughly every 18 months.

Look back at how he was proved correct since the introduction of the first microprocessor (Intel’s 4004 in 1971) and you find that the improvements have been staggering. If the same advances had been achieved with the motor car we’d all be doing 10,000 mpg. Truly mind-bending. But there is a problem: as Scotty from Star Trek would say, “You can’t change the laws of physics.”

And as we have squeezed more and more onto our microprocessors we have reached a point where quantum physical effects pose very real engineering problems. Whilst the semiconductor industry struggles to find new materials and methods of squeezing that last bit onto the available real estate, everyone recognises that the end of the road is in sight for “classical” computer improvements as we have known them over the past 30 years.

However, on the principle that if you can’t beat them, join them, the laws of quantum physics are being increasingly used to produce a new generation of computers: quantum computers. The theory goes all the way back to the 1980s, when Richard Feynman was discussing how quantum systems could be simulated, and David Deutsch realised that the approach could be used to produce a general purpose computer using quantum states as quantum bits (known as qubits) rather than classical bits (cbits). Since then there has been a great deal of theoretical work developing the science of quantum computing. Sadly, there was always the non-trivial problem of how to implement a quantum computer.

When people became aware of the potential power of quantum computers there was much excitement, not least because algorithms were being developed for quantum computers which could potentially solve problems that no classical computer would ever manage. However, most have become rather jaded over the years as a practical quantum computer has failed to make an appearance. The engineering has not kept up with the science, and quantum computing has become something of a standing joke amongst journalists when new announcements are made.

But the last 12 months have seen some significant advances, and the last two months have seen what might just be pivotal developments. The race is back on, and in addition to university laboratories we have large corporations investing in research on the implementation of quantum computing. We even have a Canadian company (D-Wave) who will sell you what they class as the world’s first commercial quantum computer, assuming, that is, that you have a spare $10 million. Purists will tell you that D-Wave’s offering is not a “real” quantum computer (it uses a process called quantum annealing), but within the last weeks researchers have used the latest D-Wave computer to solve the “protein folding” problem (which is a little like the travelling salesman problem), which was previously impractical with any classical computer that we can envisage.

Within the last month we have seen announcements from Bristol University that they have successfully proven that optical based quantum computing can be integrated onto a silicon chip. We’ve seen others using exotic techniques to produce, at room temperature, quantum computing fundamentals that previously have required massive supercooling apparatus. What was once science fiction is leaving the laboratory bench and being turned into viable devices.

**4-Qubit Processor Used to Factor 15**

Amongst the announcements was one from the University of California (Santa Barbara). It attracted some derision, with headlines such as “Quantum computer calculates that 15 = 5 x 3 half the time”. However, it’s worth taking a moment to understand what was achieved. First, a number was factorised into its two prime number constituents.

Second, this was done using a variant of what is known as Shor’s algorithm, which, when scaled up, barely breaks into a sweat whilst our best classical algorithm (the General Number Field Sieve) would take longer than the age of the universe to compute an answer. Third, the fact that it produced the correct answer only some of the time doesn’t actually matter, because whilst factoring a number into its constituent primes takes a long time, proving that those primes are the true constituents is relatively trivial: just multiply the factors together and see if you get your large number back.
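To make that concrete: in Shor’s algorithm the quantum hardware’s only job is to find the period of a^x mod N; turning that period into factors, and then checking the answer, are both easy classical steps. A minimal Python sketch of that classical post-processing for N = 15 (the function name is my own, for illustration, not from the announcement):

```python
from math import gcd

def factors_from_period(a, r, n):
    """Shor's classical post-processing: given the period r of a^x mod n
    (the part a quantum computer finds), derive candidate factors of n."""
    if r % 2 != 0:
        return None  # need an even period; pick another base a
    x = pow(a, r // 2, n)
    if x == n - 1:
        return None  # trivial square root; pick another base a
    return gcd(x - 1, n), gcd(x + 1, n)

# For n = 15 with base a = 7, the period of 7^x mod 15 is 4
# (7, 4, 13, 1, 7, ...), so:
p, q = factors_from_period(7, 4, 15)
print(p, q)          # 3 5
assert p * q == 15   # checking a factorisation really is trivial
```

Note the asymmetry: finding the period classically is what takes exponential time, while the multiplication check at the end costs essentially nothing, which is why a probabilistic answer is perfectly acceptable.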

This result is significant because much of our online security relies upon the fact that, for numbers 600-700 digits long, calculating the prime factors is infeasible in any meaningful timescale. Whilst 15 may not be 600 digits long, how long would you like to bet it will be before the methods can be scaled up, and Shor’s algorithm makes it trivial to eavesdrop on your public key encrypted web traffic? As 2012 and 2013 see the emergence of real, practical computing based upon quantum phenomena, a quick look back over the past 30 years suggests that the basis of our current web security is unlikely to suffice for more than a few years.

However, all is not doom and gloom. Quantum phenomena may actually hold the key to providing security that even a quantum computer cannot break. The main reason that Diffie & Hellman produced their ideas on public key cryptography, and that RSA was implemented so widely, was because passing the secret key used in symmetric cryptography was so difficult on the scales we must handle on the Internet. Quantum Key Distribution (QKD) enables a trustworthy channel over which shared keys can be exchanged, and QKD is very much a working technology, with commercial products available and even some cities setting up the infrastructure to enable it to be widely used.
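The core idea behind QKD can be seen in the sifting step of the BB84 protocol: sender and receiver each choose random measurement bases, and the positions where their bases happen to match yield a shared secret key. A deliberately simplified toy simulation (no eavesdropper, no channel noise; all names are illustrative):

```python
import random

def bb84_sift(n_bits, seed=1):
    """Toy sketch of BB84 key sifting. Alice encodes each bit in a random
    basis; Bob measures in his own random basis. Where the bases match,
    Bob's result equals Alice's bit; those positions form the key."""
    rng = random.Random(seed)
    alice_bits  = [rng.randint(0, 1) for _ in range(n_bits)]
    alice_bases = [rng.randint(0, 1) for _ in range(n_bits)]
    bob_bases   = [rng.randint(0, 1) for _ in range(n_bits)]
    # Bob's measurement: correct when bases match, random otherwise.
    bob_bits = [b if ab == bb else rng.randint(0, 1)
                for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)]
    # They publicly compare bases (never the bits themselves) and keep
    # only the positions where the bases agreed.
    key_alice = [b for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)
                 if ab == bb]
    key_bob   = [b for b, ab, bb in zip(bob_bits, alice_bases, bob_bases)
                 if ab == bb]
    return key_alice, key_bob

ka, kb = bb84_sift(32)
assert ka == kb  # with no eavesdropper, the sifted keys agree
```

What the simulation cannot show, of course, is the quantum part that makes QKD trustworthy: any eavesdropper measuring the qubits in transit disturbs them, introducing errors that Alice and Bob can detect by comparing a sample of their key.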

One way or another quantum information processing is going to have a significant impact on the security landscape over the next few years. I suspect our grandchildren will find it commonplace just as we now find personal computers and the Internet an everyday item.

*Cross-posted from Professor Alan Woodward*