The Fabric of Reality, by David Deutsch


participants with unpredictable patterns of communication. It is impractical to
require every pair of participants, physically and in advance, to exchange
secret cryptographic keys that would allow them later to communicate
without fear of eavesdroppers. 
Public-key cryptography is any method of
sending secret information where the sender and recipient do not already
share any secret information. The most secure known method of public-key
cryptography depends on the intractability of the problem of factorizing large
numbers. This method is known as the RSA cryptosystem, named after
Ronald Rivest, Adi Shamir and Leonard Adleman, who first proposed it in
1978. It depends on a mathematical procedure whereby a message can be
encoded using a large (say, 250-digit) number as a key. The recipient can
freely make this key public, because any message encoded with it can only
be decoded given a knowledge of the factors of that number. Thus I can
choose two 125-digit prime numbers and keep them secret, but multiply
them together and make their 250-digit product public. Anyone can send me
a message using that number as the key, but only I can read the messages
because only I know the secret factors.
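To make the arithmetic concrete, here is a minimal sketch of RSA key generation, encryption and decryption in Python. The tiny primes, the exponent and the variable names are illustrative choices of mine, not anything given in the text; real keys would use primes hundreds of digits long, as described above.

```python
# Toy RSA sketch with tiny illustrative primes; not a secure implementation.
from math import gcd

p, q = 61, 53              # the two secret primes (125-digit in practice)
n = p * q                  # public modulus: safe to publish
phi = (p - 1) * (q - 1)    # Euler's totient; computable only from p and q

e = 17                     # public encryption exponent, coprime to phi
assert gcd(e, phi) == 1
d = pow(e, -1, phi)        # private decryption exponent (Python 3.8+)

message = 42
ciphertext = pow(message, e, n)    # anyone can encrypt with (e, n)
decrypted = pow(ciphertext, d, n)  # only the holder of d can decrypt
assert decrypted == message
```

Publishing n reveals nothing useful in practice, because recovering the decryption exponent d requires phi, and computing phi from n is as hard as factorizing n.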
As I have said, there is no practical prospect of factorizing 250-digit numbers
by classical means. But a quantum factorization engine running Shor’s
algorithm could do it using only a few thousand arithmetic operations, which
might well take only a matter of minutes. So anyone with access to such a
machine would easily be able to read any intercepted message that had
been encrypted using the RSA cryptosystem.
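The quantum speed-up comes entirely from one subroutine: finding the period r of the function f(x) = a^x mod N. Everything else in Shor's algorithm is classical. The sketch below shows that classical skeleton, with a brute-force loop standing in for the quantum period-finding step; it illustrates the structure of the algorithm, not the quantum part itself.

```python
# Classical skeleton of Shor's algorithm. find_period is a brute-force
# stand-in for the quantum subroutine: on a quantum factorization engine
# this one step is done by interference among the parallel computations.
from math import gcd
from random import randrange

def find_period(a, n):
    """Smallest r > 0 with a^r = 1 (mod n): the step a quantum computer does fast."""
    x, r = a, 1
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n):
    while True:
        a = randrange(2, n)
        d = gcd(a, n)
        if d > 1:
            return d                  # lucky guess: a shares a factor with n
        r = find_period(a, n)
        if r % 2 == 0:
            d = gcd(pow(a, r // 2, n) - 1, n)
            if 1 < d < n:
                return d              # a^(r/2) - 1 shares a nontrivial factor

print(shor_factor(3233))  # recovers 53 or 61 from the toy modulus above
```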
It would do the cryptographers no good to choose larger numbers as keys
because the resources required by Shor’s algorithm increase only slowly
with the size of the number being factorized. In the quantum theory of
computation, factorization is a very tractable task. It is thought that, in the
presence of a given level of decoherence, there would again be a practical
limit on the size of number that could be factorized, but there is no known
lower limit on the rate of decoherence that can be technologically achieved.
So we must conclude that one day in the future, at a time that cannot now be
predicted, the RSA cryptosystem with any given length of key may become
insecure. In a certain sense, that makes it insecure even today. For anyone,
or any organization, that records an RSA-encrypted message today, and
waits until they can buy a quantum factorization engine with low enough
decoherence, will be able to decode the message. That may not happen for
centuries, or it may be only decades — perhaps less, who can tell? But the
likelihood that it will be rather a long time is all that now remains of the
former complete security of the RSA system.
When a quantum factorization engine is factorizing a 250-digit number, the
number of interfering universes will be of the order of 10^500 — that is, ten to
the power of 500. This staggeringly large number is the reason why Shor’s
algorithm makes factorization tractable. I said that the algorithm requires
only a few thousand arithmetic operations. I meant, of course, a few
thousand operations 
in each universe that contributes to the answer. All
those computations are performed in parallel, in different universes, and
share their results through interference.
You may be wondering how we can persuade our counterparts in 10^500-
odd universes to start working on our factorization task. Will they not have
their own agendas for the use of their computers? No — and no persuasion
is necessary. Shor’s algorithm operates initially only on a set of universes
that are 
identical to one another, and it causes them to become
differentiated only within the confines of the factorization engine. So we, who
specified the number to be factorized, and who wait while the answer is
computed, are identical in all the interfering universes. There are, no doubt,
many other universes in which we programmed different numbers or never
built the factorization engine at all. But those universes differ from ours in too
many variables — or more precisely, in variables that are not made to
interact in the right way by the programming of Shor’s algorithm — and so do
not interfere with our universe.
The argument of Chapter 2, applied to 
any interference phenomenon,
destroys the classical idea that there is only one universe. Logically, the
possibility of complex quantum computations adds nothing to a case that is
already unanswerable. But it does add psychological impact. With Shor’s
algorithm, the argument has been writ very large. To those who still cling to a
single-universe world-view, I issue this challenge: 
explain how Shor’s
algorithm works. I do not merely mean predict that it will work, which is
merely a matter of solving a few uncontroversial equations. I mean provide
an explanation. When Shor’s algorithm has factorized a number, using 10^500
or so times the computational resources that can be seen to be present,
where was the number factorized? There are only about 10^80 atoms in the
entire visible universe, an utterly minuscule number compared with 10^500.
So if the visible universe were the extent of physical reality,
physical reality would not even remotely contain the resources required to
factorize such a large number. Who did factorize it, then? How, and where,
was the computation performed?


I have been discussing traditional types of mathematical task that quantum
computers would be able to perform more quickly than existing machines.
But there is an additional class of new tasks open to quantum computers
that no classical computer could perform at all. By a strange coincidence,
one of the first of these tasks to be discovered also concerns public-key
cryptography. This time it is not a matter of breaking an existing system, but
of implementing a new, absolutely secure system of 
quantum cryptography.
In 1989, at IBM Research, Yorktown Heights, New York, in the office of the
theoretician Charles Bennett, the first working quantum computer was built.
It was a special-purpose quantum computer consisting of a pair of quantum
cryptographic devices designed by Bennett and Gilles Brassard of the
University of Montreal. It became the first machine ever to perform non-trivial
computations that no Turing machine could perform.
In Bennett and Brassard’s quantum cryptosystem, messages are encoded in
the states of individual photons emitted by a laser. Although many photons
are needed to transmit a message (one photon per bit, plus many more
photons wasted in various inefficiencies), the machines can be built with
existing technology because they need to perform their quantum
computations on only one photon at a time. The system’s security is based
not on intractability, either classical or quantum, but directly on the properties
of quantum interference: that is what gives it its classically unobtainable
absolute security. No amount of future computation by any sort of computer,
whether for millions or trillions of years, would be of any help to an
eavesdropper on quantum-encrypted messages: for if one communicates
through a medium exhibiting interference, 
one can detect eavesdroppers.
According to classical physics, there is nothing that can prevent an
eavesdropper who has physical access to a communication medium, such
as a telephone line, from installing a passive listening device. But, as I have
explained, if one makes any measurement on a quantum system one alters
its subsequent interference properties. The communication protocol relies on
this effect. The communicating parties effectively set up repeated
interference experiments, co-ordinating them over a public communication
channel. Only if the interference passes the test for there having been no
eavesdropper do they pass on to the next stage of the protocol, which is to
use some of the transmitted information as a cryptographic key. At worst, a
persistent eavesdropper might prevent any communication from taking place
at all (though of course that is more easily achieved just by cutting the
telephone line). But as for reading a message, only the intended recipient
can do that, and the guarantee of that is provided by the laws of physics.
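A minimal simulation of the Bennett and Brassard protocol makes the structure of these repeated, publicly co-ordinated tests concrete. The sketch below is an illustrative model, not the real optics: random classical choices stand in for photon polarizations and measurement bases, and the eavesdropper test is the sample comparison the text describes.

```python
# Toy simulation of the Bennett-Brassard quantum cryptosystem. Classical
# random choices stand in for photons; a real system encodes one bit per photon.
import random

def bb84(n_photons=2000, eavesdropper=False):
    # Sender picks a random bit and a random basis for each photon.
    bits = [random.randint(0, 1) for _ in range(n_photons)]
    send_bases = [random.randint(0, 1) for _ in range(n_photons)]

    channel = bits[:]
    if eavesdropper:
        # An eavesdropper must measure in a guessed basis; a wrong guess
        # randomizes the bit, which is the disturbance the protocol detects.
        for i in range(n_photons):
            if random.randint(0, 1) != send_bases[i]:
                channel[i] = random.randint(0, 1)

    # Receiver also guesses a basis; mismatched bases give random results.
    recv_bases = [random.randint(0, 1) for _ in range(n_photons)]
    received = [channel[i] if recv_bases[i] == send_bases[i]
                else random.randint(0, 1) for i in range(n_photons)]

    # Over the public channel the parties compare bases, keep the matches,
    # and sacrifice half of the surviving bits to test for eavesdropping.
    sifted = [i for i in range(n_photons) if recv_bases[i] == send_bases[i]]
    test, key = sifted[::2], sifted[1::2]
    if any(bits[i] != received[i] for i in test):
        return None                      # eavesdropper detected: abort
    return [bits[i] for i in key]        # shared secret key

print(len(bb84()), "secret key bits agreed")  # no eavesdropper present
print(bb84(eavesdropper=True))                # almost always None: detected
```

With an eavesdropper present, roughly a quarter of the tested bits are corrupted, so even a modest test sample exposes the intrusion with near certainty.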
Because quantum cryptography depends on manipulating individual
photons, it suffers from a major limitation. Each photon that is successfully
received, carrying one bit of the message, must somehow have been
transmitted intact from the transmitter to the receiver. But every method of
transmission involves losses, and if these are too heavy the message will
never arrive. Setting up relay stations (which is the remedy for this problem
in existing communication systems) would compromise the security because
an eavesdropper could, without being detected, monitor what goes on inside
the relay station. The best existing quantum-cryptographic systems use
fibre-optic cables and have a range of about ten kilometres. This would
suffice to provide, say, the financial district of a city with absolutely secure
internal communications. Marketable systems may not be far away, but to
solve the problem of public-key cryptography in general — say, for global
communication — further advances in quantum cryptography are required.
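The range limit follows directly from exponential attenuation in the fibre, as the short calculation below shows. The figure of 0.2 dB per kilometre is a typical value for modern optical fibre that I have assumed for illustration; it is not a number given in the text.

```python
# Fraction of photons surviving an optical fibre of length L km,
# assuming a typical attenuation of 0.2 dB/km (an illustrative value).
ALPHA_DB_PER_KM = 0.2

def survival(length_km):
    return 10 ** (-ALPHA_DB_PER_KM * length_km / 10)

for km in (10, 50, 100, 500):
    print(f"{km:4d} km: {survival(km):.1%} of photons arrive")
# 10 km leaves about 63% of photons; 500 km leaves about 1 in 10^10,
# which is why relay stations, and hence a security compromise, would
# otherwise be needed for long-distance links.
```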
Experimental and theoretical research in the field of quantum computation is
accelerating world-wide. Ever more promising new technologies for realizing
quantum computers are being proposed, and new types of quantum
computation with various advantages over classical computation are
continually being discovered and analysed. I find all these developments
very exciting, and I believe that some of them will bear technological fruit.
But as far as this book is concerned, that is a side-issue. From a
fundamental standpoint it does not matter how useful quantum computation
turns out to be, nor does it matter whether we build the first universal
quantum computer next week, or centuries from now, or never. The quantum
theory of computation must in any case be an integral part of the world-view
of anyone who seeks a fundamental understanding of reality. What quantum
computers tell us about connections between the laws of physics,
universality, and apparently unrelated strands of explanation of the fabric of
reality, we can discover — and are already discovering — by studying them
theoretically.
TERMINOLOGY
quantum computation Computation that requires quantum-mechanical
processes, especially interference. In other words, computation that is
performed in collaboration between parallel universes.
exponential computation A computation whose resource requirements (such
as the time required) increase by a roughly constant factor for each
additional digit in the input.
tractable/intractable (Rough-and-ready rule) A computational task is
deemed tractable if the resources required to perform it do not increase
exponentially with the number of digits in the input.
chaos The instability in the motion of most classical systems. A small
difference between two initial states gives rise to exponentially growing
deviations between the two resulting trajectories. But reality obeys quantum
and not classical physics. Unpredictability caused by chaos is in general
swamped by quantum indeterminacy caused by identical universes
becoming different.
universal quantum computer A computer that could perform any
computation that any other quantum computer could perform, and render
any finite, physically possible environment in virtual reality.
quantum cryptography Any form of cryptography that can be performed by
quantum computers but not by classical computers.
special-purpose quantum computer A quantum computer, such as a
quantum cryptographic device or quantum factorization engine, that is not a
universal quantum computer.
decoherence If different branches of a quantum computation, in different
universes, affect the environment differently, then interference is reduced
and the computation may fail. Decoherence is the principal obstacle to the
practical realization of more powerful quantum computers.
SUMMARY
The laws of physics permit computers that can render every physically
possible environment without using impractically large resources. So
universal computation is not merely possible, as required by the Turing
principle, it is also tractable. Quantum phenomena may involve vast
numbers of parallel universes and therefore may not be capable of being
efficiently simulated within one universe. However, this strong form of
universality still holds because quantum computers can efficiently render
every physically possible quantum environment, even when vast numbers of
universes are interacting. Quantum computers can also efficiently solve
certain mathematical problems, such as factorization, which are classically
intractable, and can implement types of cryptography which are classically
impossible. Quantum computation is a qualitatively new way of harnessing
nature.
The next chapter is likely to provoke many mathematicians. This can’t be
helped. Mathematics is not what they think it is.
(Readers who are unfamiliar with traditional assumptions about the certainty
of mathematical knowledge may consider the chapter’s main conclusion —
that our knowledge of mathematical truth depends on, and is no more
reliable than, our knowledge of the physical world — to be obvious. Such
readers may prefer to skim this chapter and hurry on to the discussion of
time in Chapter 11.)