What is Quantum Computing?
Quantum computing is set to change the world, promising computing power far in excess of every computer ever built, but despite that great promise it's not yet widely available or particularly useful.
Firstly, I'll explain a little bit (no pun intended!) about how traditional computers work and how a quantum computer compares. I'll also define a few terms.
In computing, a 'bit' is a term used to identify the smallest amount of information that a computer can store. It represents an electrical state which is either on or off and is commonly stated as being either 1 or 0. There are only two possible values that can be represented by a 1-bit system. To make things easier to understand, let's imagine that a bit is a coin - the coin can only be a head (1) or a tail (0).
A 2-bit system consists of two coins (bits). With two coins there are four combinations: head+head, head+tail, tail+head, tail+tail. A 3-bit system has 2^3 = 8 possible values, and so on - each extra bit doubles the number of combinations.
A 'byte' is a group of 8 bits and has 2^8 = 256 possible combinations, but a single byte can only represent one of those combinations at any given time.
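The counting above is easy to check with a short sketch (plain Python, nothing quantum about it):

```python
from itertools import product

def combinations(n_bits):
    """List every value an n-bit system can represent - one at a time."""
    return [''.join(bits) for bits in product('01', repeat=n_bits)]

print(combinations(2))       # ['00', '01', '10', '11'] - four combinations
print(len(combinations(3)))  # 8
print(len(combinations(8)))  # 256 - one byte
```

Each extra bit doubles the list, which is why the totals follow powers of two.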
A Quantum Bit (or qubit) has the remarkable property of being able to hold a value of 1, 0, or any combination of the two. In this state, the coin can be heads, tails, or any ratio of head:tail, meaning that (prepare to have your mind blown) a qubit can hold every value simultaneously. A qubyte, therefore, can hold every value between 0 and 255 at the same time. Just 100 qubits can represent about 1.27x10^30 (that is, 2^100) different numbers, many trillion times the storage capacity of all the computer storage ever made.
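The arithmetic behind that claim is just powers of two - n qubits span 2^n distinct basis states:

```python
# n (qu)bits span 2**n distinct basis states.
print(2 ** 8)    # 256 - a (qu)byte's worth
print(2 ** 100)  # 1267650600228229401496703205376, roughly 1.27x10^30
```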
Now, thanks to Schrödinger and his feline friend, we know that once a qubit is measured it takes on one of the two states, heads or tails.
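A toy classical simulation makes the collapse idea concrete. The `Qubit` class below is entirely hypothetical - a sketch of the mathematics, not how a real quantum computer works:

```python
import math
import random

class Qubit:
    """Toy classical simulation of a single qubit a|0> + b|1>."""
    def __init__(self, a, b):
        norm = math.sqrt(abs(a) ** 2 + abs(b) ** 2)
        self.a, self.b = a / norm, b / norm  # normalised amplitudes

    def measure(self):
        """Measurement collapses the state to 0 or 1, with
        probabilities |a|^2 and |b|^2 respectively."""
        outcome = 0 if random.random() < abs(self.a) ** 2 else 1
        # The superposition is gone after measurement:
        self.a, self.b = (1, 0) if outcome == 0 else (0, 1)
        return outcome

q = Qubit(1, 1)              # an equal heads/tails superposition
first = q.measure()          # randomly 0 or 1...
print(first in (0, 1))       # True
print(q.measure() == first)  # True - the state has collapsed
```

Measuring a second time always repeats the first result: the coin has landed.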
Let's have a look at exactly what a bit is. I've already told you that it is a 1 or a 0, a head or a tail, on or off - but what exactly is it in physical terms?
In the earliest non-electronic information processing devices, such as Babbage's Analytical Engine, a bit was 'stored' as the position of a mechanical lever or gear, and later as the presence or absence of a punched hole at a specific point on a paper card or tape. Later still, bits were represented by magnetic polarity. Bits are easy to make: they can be large or small, and can be stored in permanent or temporary media.
Qubits, on the other hand, are far more complex and elusive. In fact, nobody is even sure of the best method for creating them. Some methods involve trapping ions, electrons or other tiny particles; some propose using superconductors to create microscopic quantum circuits; others suggest it might be possible to use photons and complex optical apparatus to achieve a similar goal. What these techniques all have in common, however, is that they're currently plausible on a small scale but incredibly difficult to realise on a large one.
The problems don't end once you have made a few qubits. As Schrödinger pointed out, quantum systems need to be isolated from the rest of the world in order to work. If qubits interact with the external world, they decohere - they collapse down into a definite binary state, just like the bits in a traditional computer.
Current thinking is that it is best to decide on an acceptable error rate and design around it. With enough qubits, even a small error rate can still outperform conventional computing. The trouble is that qubits are hard to produce, making them very expensive and, for now, only available in research machines.
Using a Quantum Computer
Let's say, for argument's sake, that you have a working quantum computer, with all its quantum power at your fingertips. You might expect to be able to run Windows very quickly and launch multiple applications instantly and with ease. But you would be very wrong. In fact, it would be a nightmare.
The first problem you will encounter is that there isn't really any way of knowing whether it is working properly, since observing a quantum system changes the outcome (thanks to Schrödinger again). In fact, the currently available quantum computers aren't verified to be working the way they're supposed to; they're built on the right theory, some crossed fingers, and judged by their output.
How would you go about programming a quantum computer? Our current programming logic would have to be scrapped, as any answer you get out of one is based on probability, not a definitive value. Even a variable comparison could be ambiguous. For example:
if (variable1 == variable2) is the most basic programming comparison, and traditionally the result is either true or false. What would a programmer do if the comparison returned 'possibly true', 'maybe false', or something else?
It is likely that a calculation would need to be repeated multiple times to get a "most probable" answer. But if you have to run a calculation multiple times, the overall time taken increases - is it still worth using a quantum computer at all?
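That repeat-and-vote idea can be sketched with an imaginary probabilistic comparison. Both `noisy_equal` and its 20% error rate are made up for illustration - nothing here reflects a real quantum API:

```python
import random
from collections import Counter

def noisy_equal(x, y, error_rate=0.2):
    """Imaginary probabilistic comparison: correct only 80% of the time."""
    truth = (x == y)
    return truth if random.random() > error_rate else not truth

def probable_answer(x, y, runs=101):
    """Run the noisy comparison many times and take the majority vote."""
    votes = Counter(noisy_equal(x, y) for _ in range(runs))
    return votes.most_common(1)[0][0]

print(probable_answer(7, 7))  # almost certainly True
print(probable_answer(7, 8))  # almost certainly False
```

With 101 runs, the odds of the majority vote being wrong are vanishingly small - but those 101 runs are exactly the time overhead the paragraph above worries about.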
The next point reminds me of Deep Thought in The Hitchhiker's Guide to the Galaxy (Douglas Adams, 1978): how do you test the answer a quantum computer gives you?
Depending on the problem, you cannot know whether the answer is correct! In theory, a quantum computer can solve in hours a calculation that would take a normal computer thousands of years. So how do you know the answer is right if there is no way of verifying it?
Quantum computing is still in its infancy; compared to the traditional computing timeline, our understanding of quantum computing is probably at around the abacus stage of development. We are still a long way from having a practical quantum computer, but progress is being made rapidly.
Update, May 2013
Google announced the Quantum Artificial Intelligence Lab, based at NASA's Ames Research Center, to study how quantum computing might advance machine learning.
Last updated on: Saturday 17th June 2017