Some hope that a move to quantum computing—qubits instead of bits, analog instead of digital—will work wonders, including the invention of the true thinking computer. In last week’s podcast, futurist George Gilder and computer engineer Robert J. Marks looked at, among other things, what’s really happening with quantum computing:
(The quantum computing discussion begins at 15:04.)
Robert J. Marks: What’s your take on quantum computing? It seems to me that there’s been glacial progress in the technology.
George Gilder (pictured): I think quantum computing is rather like AI, in that it moves the actual problem outside the computational process and gives the illusion that it solved the problem, but it’s really just pushed the problem out. Quantum computing is analog computing, that’s what it is. It’s changing primitives of the computation to quantum elements, which are presumably the substance of all matter in the universe.
Note: Quantum computing would use actual quantum elements (qubits) to compute instead of digital signals, thus taking advantage of their subatomic speed. But, as AI theorists have noted, that doesn’t get around the halting problem (the computer still doesn’t know what it is doing). That means that a quantum computer still wouldn’t replicate human intelligence. That, in turn, is one reason that “quantum supremacy” can sound a lot like hype.
George Gilder: But still you’ve got to translate the symbols in the world, which in turn have to be translated from the objects in the world, into these qubits, which are quantum entities. Once you’ve defined all these connections and structured the data, then the problem is essentially solved by the process of defining it and inputting it into the computer… but quantum computing again is a very special purpose machine, extremely special purpose. Because everything has to be exactly structured right for it.
Robert J. Marks: Yeah, that’s my point. I think that once we get quantum computing and if it works well, we can also do quantum encryption, which quantum computing can’t decode. So that’s the next step. So yeah, that’s fascinating stuff.
The qubit is one of the most enigmatic tangles of matter and ghost in the entire armament of physics. Like a binary digit, it can register 0 or 1; what makes it quantum is that it can also register a nonbinary “superposition” of 0 and 1.
In 1989 I published a book, Microcosm, with the subtitle The Quantum Era in Economics and Technology. Microcosm made the observation that all computers are quantum machines in that they shun the mechanics of relays, cogs, and gears, and manipulate matter from the inside following quantum rules. But they translate all measurements and functions into rigorous binary logic—every bit is 1 or 0. At the time I was writing Microcosm, a few physicists were speculating about a computer that used qubits rather than bits, banishing this translation process and functioning directly in the quantum domain. (p. 39)
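The difference Gilder describes, between a bit that is strictly 1 or 0 and a qubit that can hold a superposition of both, can be illustrated with a minimal sketch. This is not from the article; it uses the standard textbook representation of a qubit as a normalized two-component complex vector, with measurement probabilities given by squared amplitudes:

```python
import numpy as np

# A classical bit is exactly 0 or 1. A qubit is a normalized complex
# vector a|0> + b|1>, with |a|^2 + |b|^2 = 1.
zero = np.array([1, 0], dtype=complex)  # the state |0>
one = np.array([0, 1], dtype=complex)   # the state |1>

# An equal superposition of 0 and 1 -- a state no classical bit can hold.
qubit = (zero + one) / np.sqrt(2)

# Measurement collapses the superposition: the probability of each
# outcome is the squared magnitude of its amplitude.
p0 = abs(qubit[0]) ** 2
p1 = abs(qubit[1]) ** 2
print(p0, p1)  # roughly 0.5 and 0.5
```

Until it is measured, the qubit is neither 0 nor 1; a digital computer simulating this must translate the state into binary numbers, which is the “translation process” Gilder says a quantum computer would banish.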
The quantum world impinges on computer technology whether we like it or not:
For example, today the key problem in microchips is to avoid spontaneous quantum tunneling, where electrons can find themselves on the other side of a barrier that by the laws of classical physics would have been insurmountable and impenetrable. In digital memory chips or processors, spontaneous tunneling can mean leakage and loss. In a quantum computer, though, such quantum effects may endow a portfolio of features, providing a tool or computational “primitive” that enables simulation of a world governed by quantum rules. (p. 40)
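Why shrinking chip features makes tunneling leakage worse can be seen in a back-of-the-envelope sketch. This uses the standard WKB approximation for an electron tunneling through a rectangular barrier; the 1 eV barrier height and the widths are illustrative numbers, not figures from the article:

```python
import math

HBAR = 1.054571817e-34   # reduced Planck constant, J*s
M_E = 9.1093837015e-31   # electron mass, kg
EV = 1.602176634e-19     # joules per electron-volt

def tunneling_probability(barrier_ev, width_nm):
    """WKB estimate: T ~ exp(-2 * kappa * d) for a rectangular barrier."""
    kappa = math.sqrt(2 * M_E * barrier_ev * EV) / HBAR  # decay constant, 1/m
    return math.exp(-2 * kappa * width_nm * 1e-9)

# As an insulating barrier thins from 3 nm toward 1 nm, the leakage
# probability rises by many orders of magnitude.
for width in (3.0, 2.0, 1.0):
    print(f"{width} nm barrier: T ~ {tunneling_probability(1.0, width):.2e}")
```

The exponential dependence on barrier width is the point: classically the electron could never cross, but quantum mechanically the leakage grows steeply as transistors shrink, which is why tunneling is “the key problem in microchips” for digital designs even as it becomes a usable primitive for quantum ones.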
Quantum rules, while strange, might ensure the integrity of a connection because entangled quantum particles respond to each other no matter how far they are separated:
A long-ago thought experiment of Einstein’s showed that once any two photons—or other quantum entities—interact, they remain in each other’s influence no matter how far they travel across the universe (as long as they do not interact with something else). Schrödinger christened this “entanglement”: The spin—or other quantum attribute—of one behaves as if it reacts to what happens to the other, even when the two are impossibly remote. (p. 40)
So, short of such an interaction, no one can tamper with the data on their side without the disturbance being noticed…
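The perfect correlation Schrödinger called entanglement can be sketched with the simplest entangled pair, a Bell state. This example is not from the article; it simulates repeated measurements of the two-qubit state (|00> + |11>)/√2:

```python
import numpy as np

# Amplitudes over the two-qubit basis states 00, 01, 10, 11:
# the Bell state (|00> + |11>)/sqrt(2).
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
probs = np.abs(bell) ** 2  # measurement probabilities per outcome

# Simulate 1,000 joint measurements of the entangled pair.
rng = np.random.default_rng(0)
outcomes = rng.choice(["00", "01", "10", "11"], size=1000, p=probs)

# The two qubits always agree: only "00" and "11" ever occur,
# however far apart the particles are when measured.
print(sorted(set(outcomes)))
```

Each qubit alone looks like a fair coin flip, yet the pair never disagrees, which is why entanglement can reveal tampering: interacting with one half of the pair breaks the correlation.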
Underlying all this heady particle physics and quantum computing speculation is actually a philosophical shift. As Gilder puts it in Gaming AI,
John Wheeler provocatively spoke of “it from bit” and “the elementary act of observer-participancy”: “in short… all things physical are information-theoretic in origin and this is a participatory universe.” (p. 41)
Which is another way of saying that in reality information, rather than matter and energy, rules our universe.
Also discussed in last week’s podcast (with links to the series and transcripts):
While the West hesitates, China is moving to blockchain. Life After Google by George Gilder, which advocates blockchain, became a best seller in China and received a social sciences award. Gilder, also the author of Gaming AI, explains why Bitcoin, as a future currency, might not do as well as blockchain in general.
You may also enjoy: Will quantum mechanics produce the true thinking computer? Quantum computers come with real-world problems of their own.
Why AI geniuses haven’t created true thinking machines. The problems have been hinting at themselves all along.
Next: What’s the future for carbon computing?