The theory of relativity deals with the geometric
structure of a four-dimensional spacetime. Quantum mechanics
describes properties of matter. Combining these
two theoretical edifices is a difficult proposition. For example,
there is no way of defining a relativistic proper
time for a quantum system which is spread all over
space. A proper time can in principle be defined for a
massive apparatus (‘‘observer’’) whose Compton wavelength
is so small that its center of mass has classical
coordinates and follows a continuous world line. However,
when there is more than one apparatus, there is no
role for the private proper times that might be attached
to the observers’ world lines. Therefore a physical situation
involving several observers in relative motion cannot
be described by a wave function with a relativistic
transformation law (Aharonov and Albert, 1981; Peres,
1995, and references therein). This should not be surprising
because a wave function is not a physical object.
It is only a tool for computing the probabilities of objective
macroscopic events.
 
Einstein’s [special] principle of relativity asserts that there are
no privileged inertial frames. 
 
[Comment #3: Einstein's general principle of relativity is that there are no privileged local accelerating frames (AKA LNIFs). In addition, Einstein's equivalence principle is that one can always find a local inertial frame (LIF) coincident with a LNIF (over a small enough region of 4D spacetime) in which, to a good approximation, Newton's 1/r^2 force is negligible ("Einstein's happiest thought"). Therefore, Newton's universal "gravity force" is a purely inertial, fictitious pseudo-force, exactly like the Coriolis, centrifugal, and Euler forces that are artifacts of the proper acceleration of the detector, having no real effect on the test particle being measured by the detector. The latter assumes no rigid constraint between detector and test particle. For example, a test particle clamped at radius r to the edge of a uniformly, slowly rotating disk feels a real EM force of constraint equal to m ω × (ω × r).]
 
This does not imply the
necessity or even the possibility of using manifestly symmetric
four-dimensional notations. This is not a peculiarity
of relativistic quantum mechanics. Likewise, in classical
canonical theories, time has a special role in the
equations of motion.
 
The relativity principle is extraordinarily restrictive.
For example, in ordinary classical mechanics with a finite
number of degrees of freedom, the requirement that
the canonical coordinates have the meaning of positions,
so that particle trajectories q(t) transform like
four-dimensional world lines, implies that these lines
consist of straight segments. Long-range interactions are
forbidden; there can be only contact interactions between
point particles (Currie, Jordan, and Sudarshan,
1963; Leutwyler, 1965). Nontrivial relativistic dynamics
requires an infinite number of degrees of freedom,
which are labeled by the spacetime coordinates (this is
called a field theory).
 
Combining relativity and quantum theory is not only a
difficult technical question on how to formulate dynamical
laws. The ontologies of these theories are radically
different. Classical theory asserts that fields, velocities,
etc., transform in a definite way and that the equations
of motion of particles and fields behave covariantly. …
 
For example, if the expression for the Lorentz force is written
… in one frame, the same expression is valid
in any other frame. These symbols … have objective
values. They represent entities that really exist, according
to the theory. On the other hand, wave functions
are not defined in spacetime, but in a multidimensional
Hilbert space. They do not transform covariantly when
there are interventions by external agents, as will be
seen in Sec. III. Only the classical parameters attached
to each intervention transform covariantly. Yet, in spite
of the noncovariance of ρ, the final results of the calculations
(the probabilities of specified sets of events) must
be Lorentz invariant.
 
As a simple example, consider our two observers, conventionally
called Alice and Bob,4 holding a pair of spin-1/2
particles in a singlet state. Alice measures s_z and finds
+1, say. This tells her what the state of Bob’s particle is,
namely, the probabilities that Bob would obtain + or - 1 if he
measures (or has measured, or will measure) s along
any direction he chooses. This is purely counterfactual
information: nothing changes at Bob’s location until he
performs the experiment himself, or receives a message
from Alice telling him the result that she found. In particular,
no experiment performed by Bob can tell him
whether Alice has measured (or will measure) her half
of the singlet.
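These correlations are easy to exhibit explicitly. The following is a minimal NumPy sketch (illustrative only; Alice is taken to measure s_z) showing both the perfect anticorrelation conditioned on Alice's result and the fact that Bob's unconditional state is unchanged:

```python
import numpy as np

# Singlet state |psi> = (|01> - |10>)/sqrt(2); basis order |00>, |01>, |10>, |11>.
psi = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)
rho = np.outer(psi, psi.conj())

def trace_out_alice(r):
    """Partial trace over Alice's qubit (the first tensor factor)."""
    return np.einsum('ikil->kl', r.reshape(2, 2, 2, 2))

# Alice measures s_z: projectors onto her +1 and -1 outcomes.
I2 = np.eye(2)
P = {+1: np.diag([1.0, 0.0]), -1: np.diag([0.0, 1.0])}

cond_states = {}
for a, Pa in P.items():
    joint = np.kron(Pa, I2) @ rho @ np.kron(Pa, I2)
    p_a = np.trace(joint).real                     # each outcome occurs with probability 1/2
    cond_states[a] = trace_out_alice(joint) / p_a  # Bob's state *given* Alice's result

# Perfect anticorrelation: Alice +1  =>  Bob's particle is in the s_z = -1 state.
assert np.allclose(cond_states[+1], np.diag([0, 1]))

# Purely counterfactual: Bob's unconditional state is I/2 whether or not Alice measured.
assert np.allclose(trace_out_alice(rho), I2 / 2)
```

The second assertion is the content of the remark above: no experiment performed by Bob alone can reveal whether Alice has measured her half of the singlet.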
 
A seemingly paradoxical way of presenting these results
is to ask the following naive question. Suppose that
Alice finds that s_z = +1 while Bob does nothing. When
does the state of Bob's particle, far away, become the
one for which s_z = -1 with certainty? Although this
question is meaningless, it may be given a definite answer:
Bob’s particle state changes instantaneously. In
which Lorentz frame is this instantaneous? In any
frame! Whatever frame is chosen for defining simultaneity,
the experimentally observable result is the same, as
can be shown in a formal way (Peres, 2000b). Einstein
himself was puzzled by what seemed to be the instantaneous
transmission of quantum information. In his autobiography,
he used the words ‘‘telepathically’’ and
‘‘spook’’ (Einstein, 1949). …
 
In the laboratory, any experiment
has to be repeated many times in order to infer a
law; in a theoretical discussion, we may imagine an infinite
number of replicas of our gedanken experiment, so
as to have a genuine statistical ensemble. Yet the validity
of the statistical nature of quantum theory is not restricted
to situations in which there are a large number
of similar systems. Statistical predictions do apply to
single events. When we are told that the probability of
precipitation tomorrow is 35%, there is only one tomorrow.
This tells us that it may be advisable to carry an
umbrella. Probability theory is simply the quantitative
formulation of how to make rational decisions in the
face of uncertainty (Fuchs and Peres, 2000). A lucid
analysis of how probabilistic concepts are incorporated
into physical theories is given by Emch and Liu (2002).
 
[My comment #4: Peres is correct, but there is no conflict with Bohm's ontological
interpretation here. The Born probability rule is not fundamental to quantum reality
in Bohm's view, but is a limiting case when the beables are in thermal equilibrium.]
 
...
 
Some trends in modern quantum information theory
may be traced to security problems in quantum communication.
A very early contribution was Wiesner’s seminal
paper ‘‘Conjugate Coding,’’ which was submitted
circa 1970 to IEEE Transactions on Information Theory
and promptly rejected because it was written in a jargon
incomprehensible to computer scientists (this was actually
a paper about physics, but it had been submitted to
a computer science journal). Wiesner’s article was finally
published (Wiesner, 1983) in the newsletter of ACM
SIGACT (Association for Computing Machinery, Special
Interest Group in Algorithms and Computation
Theory). That article tacitly assumed that exact duplication
of an unknown quantum state was impossible, well
before the no-cloning theorem (Dieks, 1982; Wootters
and Zurek, 1982) became common knowledge. Another
early article, ‘‘Unforgeable Subway Tokens’’ (Bennett
et al., 1983) also tacitly assumed the same.
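The tacitly assumed impossibility follows from linearity alone. A hypothetical "copier" that duplicates both basis states necessarily fails on superpositions (a minimal sketch of the no-cloning argument; this is not Wiesner's or Bennett's construction):

```python
import numpy as np

zero = np.array([1, 0], dtype=complex)
one = np.array([0, 1], dtype=complex)

def copier(state):
    """Linear map on state ⊗ |0> that, by definition, clones the basis states:
    |0>|0> -> |0>|0> and |1>|0> -> |1>|1>."""
    a, b = state
    return a * np.kron(zero, zero) + b * np.kron(one, one)

plus = (zero + one) / np.sqrt(2)
cloned = copier(plus)                  # linearity gives (|00> + |11>)/sqrt(2), entangled
true_copy = np.kron(plus, plus)        # what perfect cloning would have produced
print(np.allclose(cloned, true_copy))  # False: linearity forbids exact duplication
```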
 
II. THE ACQUISITION OF INFORMATION
A. The ambivalent quantum observer
Quantum mechanics is used by theorists in two different
ways. It is a tool for computing accurate relationships
between physical constants, such as energy levels,
cross sections, transition rates, etc. These calculations
are technically difficult, but they are not controversial.
In addition to this, quantum mechanics also provides
statistical predictions for results of measurements performed
on physical systems that have been prepared in a
specified way. 
 
[My comment #5: No mention of Yakir Aharonov's intermediate-present "weak measurements"
with both a past "history" pre-selection and a future "destiny" post-selection constraint. The latter, in
Wheeler delayed-choice mode, would force the inference of real back-from-the-future retrocausality.
This would still be consistent with Abner Shimony's "passion at a distance," i.e. "signal locality,"
in that the observer at the present weak measurement would not know what the future constraint
actually will be. In contrast, with signal nonlocality (Sarfatti 1976 MIT Tech Review (Martin Gardner) &
Antony Valentini (2002)) such spooky precognition would be possible, as in Russell Targ's reports on
CIA-funded RV experiments at SRI in the mid '70s and '80s.
This is, on the face of it, a gross violation of orthodox
quantum theory as laid out here in the Peres review paper.]
 
The quantum measuring process is the interface
of classical and quantum phenomena. The preparation
and measurement are performed by macroscopic
devices, and these are described in classical terms. The
necessity of using a classical terminology was emphasized
by Niels Bohr (1927) from the very early days of
quantum mechanics. Bohr’s insistence on a classical description
was very strict. He wrote (1949)
 
‘‘ . . . by the word ‘experiment’ we refer to a situation
where we can tell others what we have done and what
we have learned and that, therefore, the account of the
experimental arrangement and of the results of the observations
must be expressed in unambiguous language,
with suitable application of the terminology of
classical physics.’’
 
Note the words ‘‘we can tell.’’ Bohr was concerned
with information, in the broadest sense of this term. He
never said that there were classical systems or quantum
systems. There were physical systems, for which it was
appropriate to use the classical language or the quantum
language. There is no guarantee that either language
gives a perfect description, but in a well-designed experiment
it should be at least a good approximation.
 
Bohr’s approach divides the physical world into ‘‘endosystems’’
(Finkelstein, 1988), which are described by
quantum dynamics, and ‘‘exosystems’’ (such as measuring
apparatuses), which are not described by the dynamical
formalism of the endosystem under consideration.
A physical system is called ‘‘open’’ when parts of
the universe are excluded from its description. In different
Lorentz frames used by observers in relative motion,
different parts of the universe may be excluded. The
systems considered by these observers are then essentially
different, and no Lorentz transformation exists
that can relate them (Peres and Terno, 2002).
 
It is noteworthy that Bohr never described the measuring
process as a dynamical interaction between an
exophysical apparatus and the system under observation.
He was, of course, fully aware that measuring apparatuses
are made of the same kind of matter as everything
else, and they obey the same physical laws. It is
therefore tempting to use quantum theory in order to
investigate their behavior during a measurement. However,
if this is done, the quantized apparatus loses its
status as a measuring instrument. It becomes a mere intermediate
system in the measuring process, and there
must still be a final instrument that has a purely classical
description (Bohr, 1939).
 
Measurement was understood by Bohr as a primitive
notion. He could thereby elude questions which caused
considerable controversy among other authors. A
quantum-dynamical description of the measuring process
was first attempted by John von Neumann in his
treatise on the mathematical foundations of quantum
theory (1932). In the last section of that book, as in an
afterthought, von Neumann represented the apparatus
by a single degree of freedom, whose value was correlated
with that of the dynamical variable being measured.
Such an apparatus is not, in general, left in a definite
pure state, and it does not admit a classical
description. Therefore von Neumann introduced a second
apparatus which observes the first one, and possibly
a third apparatus, and so on, until there is a final measurement,
which is not described by quantum dynamics
and has a definite result (for which quantum mechanics
can give only statistical predictions). The essential point
that was suggested, but not proved by von Neumann, is
that the introduction of this sequence of apparatuses is
irrelevant: the final result is the same, irrespective of the
location of the ‘‘cut’’ between classical and quantum
physics.8
 
These different approaches of Bohr and von Neumann
were reconciled by Hay and Peres (1998), who …
8At this point, von Neumann also speculated that the final
step involves the consciousness of the observer—a bizarre
statement in a mathematically rigorous monograph (von Neumann,
1955).
 
B. The measuring process
Dirac (1947) wrote that ‘‘a measurement always
causes the system to jump into an eigenstate of the dynamical
variable being measured.’’ Here, we must be
careful: a quantum jump (also called a collapse) is something
that happens in our description of the system, not
to the system itself. Likewise, the time dependence of
the wave function does not represent the evolution of a
physical system. It only gives the evolution of probabilities
for the outcomes of potential experiments on that
system (Fuchs and Peres, 2000).
 
Let us examine more closely the measuring process.
First, we must refine the notion of measurement and
extend it to a more general one: an intervention. An
intervention is described by a set of parameters which
include the location of the intervention in spacetime, referred
to an arbitrary coordinate system. We also have
to specify the speed and orientation of the apparatus in
the coordinate system that we are using as well as various
other input parameters that control the apparatus,
such as the strength of a magnetic field or that of a rf
pulse used in the experiment. The input parameters are
determined by classical information received from past
interventions, or they may be chosen arbitrarily by the
observer who prepares that intervention or by a local
random device acting in lieu of the observer.
 
[My comment #6: Peres, in my opinion, makes another mistake.
Future interventions will affect past weak measurements.
 

Back From the Future

A series of quantum experiments shows that measurements performed in the future can influence the present. Does that mean the universe has a destiny—and the laws of physics pull us inexorably toward our prewritten fate?

By Zeeya Merali|Thursday, August 26, 2010
http://discovermagazine.com/2010/apr/01-back-from-the-future#.UieOnhac5Hw ]
 
An intervention has two consequences. One is the acquisition
of information by means of an apparatus that
produces a record. This is the ‘‘measurement.’’ Its outcome,
which is in general unpredictable, is the output of
the intervention. The other consequence is a change of
the environment in which the quantum system will
evolve after completion of the intervention. For example,
the intervening apparatus may generate a new
Hamiltonian that depends on the recorded result. In particular,
classical signals may be emitted for controlling
the execution of further interventions. These signals are,
of course, limited to the velocity of light.
 
The experimental protocols that we consider all start
in the same way, with the same initial state ... , and the
first intervention is the same. However, later stages of
the experiment may involve different types of interventions,
possibly with different spacetime locations, depending
on the outcomes of the preceding events. Yet,
assuming that each intervention has only a finite number
of outcomes, there is for the entire experiment only a
finite number of possible records. (Here, the word
record means the complete list of outcomes that occurred
during the experiment. We do not want to use the
word history, which has acquired a different meaning in
the writings of some quantum theorists.)
 
Each one of these records has a definite probability in
the statistical ensemble. In the laboratory, experimenters
can observe its relative frequency among all the records
that were obtained; when the number of records tends
to infinity, this relative frequency is expected to tend to
the true probability. The aim of theory is to predict the
probability of each record, given the inputs of the various
interventions (both the inputs that are actually controlled
by the local experimenter and those determined
by the outputs of earlier interventions). Each record is
objective: everyone agrees on what happened (e.g.,
which detectors clicked). Therefore, everyone agrees on
what the various relative frequencies are, and the theoretical
probabilities are also the same for everyone.
Interventions are localized in spacetime, but quantum
systems are pervasive. In each experiment, irrespective
of its history, there is only one quantum system, which
may consist of several particles or other subsystems, created
or annihilated at the various interventions. Note
that all these properties still hold if the measurement
outcome is the absence of a detector click. It does not
matter whether this is due to an imperfection of the detector
or to a probability less than 1 that a perfect detector
would be excited. The state of the quantum system
does not remain unchanged. It has to change to
respect unitarity. The mere presence of a detector that
could have been excited implies that there has been an
interaction between that detector and the quantum system.
Even if the detector has a finite probability of remaining
in its initial state, the quantum system correlated
to the latter acquires a different state (Dicke,
1981). The absence of a click, when there could have
been one, is also an event.
 
 
The measuring process involves not only the physical
system under study and a measuring apparatus (which
together form the composite system C) but also their
environment, which includes unspecified degrees of freedom
of the apparatus and the rest of the world. These
unknown degrees of freedom interact with the relevant
ones, but they are not under the control of the experimenter
and cannot be explicitly described. Our partial
ignorance is not a sign of weakness. It is fundamental. If
everything were known, acquisition of information
would be a meaningless concept.
 
 
A complete description of C involves both macroscopic
and microscopic variables. The difference between
them is that the environment can be considered as
adequately isolated from the microscopic degrees of
freedom for the duration of the experiment and is not
influenced by them, while the environment is not isolated
from the macroscopic degrees of freedom. For example,
if there is a macroscopic pointer, air molecules bounce
from it in a way that depends on the position of that
pointer. Even if we can neglect the Brownian motion of
a massive pointer, its influence on the environment leads
to the phenomenon of decoherence, which is inherent to
the measuring process.
 
An essential property of the composite system C,
which is necessary to produce a meaningful measurement,
is that its states form a finite number of orthogonal
subspaces which are distinguishable by the observer.
 
[My comment #7: This is not the case for Aharonov's weak measurements where
 
<A>weak = <history|A|destiny>/<history|destiny>
 
Nor is it true when Alice's orthogonal micro-states are entangled with Bob's far away distinguishably non-orthogonal macro-quantum Glauber coherent and possibly squeezed states.
 
1. Coherent states, Wikipedia: en.wikipedia.org/wiki/Coherent_states ("a coherent state is the specific quantum state of the quantum harmonic oscillator whose dynamics most closely resembles …")
2. B. C. Sanders, "Review of Entangled Coherent States," arXiv (quant-ph), Dec 8, 2011.
 
|Alice,Bob> = (1/2)[|Alice +1>|Bob alpha> + |Alice -1>|Bob beta>]
 
<Alice+1|Alice -1> = 0
 
<Bob alpha|Bob beta> =/= 0  
 
 
e.g. Partial trace over Bob's states:  |<Alice +1|Alice-Bob>|^2 = (1/2)[1 + |<Bob alpha|Bob beta>|^2] > 1/2
 
this is formally like a weak measurement where the usual Born probability rule breaks down. 
 
Complete isolation from environmental decoherence is assumed here.
 
It is a clear violation of "passion at a distance" no-entanglement-signaling arguments, based on axioms that are empirically false in my opinion.
 
"The statistics of Bob’s result are not affected at all by what Alice may simultaneously do somewhere else. " (Peres) 
 
is false.
 
While a logically correct formal proof is desirable in physics, Nature has ways of leapfrogging over its premises.
 
One can have constrained pre and post-selected conditional probabilities that are greater than 1, negative and even complex numbers. 
 
All of which correspond to observable effects in the laboratory - see Aephraim Steinberg's experimental papers
University of Toronto.]
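For what it is worth, the coherent-state overlap invoked in the comment above is straightforward to evaluate from the standard formula <alpha|beta> = exp(-|alpha|^2/2 - |beta|^2/2 + alpha*·beta). The sketch below (with arbitrarily chosen amplitudes) just computes the quantities appearing in the comment:

```python
import numpy as np

def coherent_overlap(alpha, beta):
    """Standard overlap of two harmonic-oscillator coherent states."""
    return np.exp(-abs(alpha) ** 2 / 2 - abs(beta) ** 2 / 2 + np.conj(alpha) * beta)

alpha, beta = 1.0, 1.5                 # arbitrary, nearby coherent amplitudes
ov = coherent_overlap(alpha, beta)

# Non-orthogonality: |<alpha|beta>|^2 = exp(-|alpha - beta|^2) for real amplitudes.
overlap_sq = abs(ov) ** 2

# The quantity (1/2)[1 + |<alpha|beta>|^2] from the comment above:
p = 0.5 * (1 + overlap_sq)
print(overlap_sq, p)                   # p exceeds 1/2 whenever the overlap is nonzero
```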
 
Each macroscopically distinguishable subspace corresponds
to one of the outcomes of the intervention and
defines a POVM element E_m, given explicitly by Eq. (8)
below. …
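The standard Kraus/POVM relation, E_m = A_m† A_m with Σ_m E_m = 1, can be sketched with an amplitude-damping example (illustrative only; the damping parameter gamma is arbitrary):

```python
import numpy as np

# Kraus matrices for amplitude damping (a standard illustrative channel).
gamma = 0.3
A0 = np.array([[1, 0], [0, np.sqrt(1 - gamma)]])
A1 = np.array([[0, np.sqrt(gamma)], [0, 0]])

# POVM elements E_m = A_m^dagger A_m: positive operators summing to the identity.
E = [A.conj().T @ A for A in (A0, A1)]
assert np.allclose(E[0] + E[1], np.eye(2))

# Probability of outcome m for state rho: p_m = Tr(rho E_m).
rho = np.diag([0.25, 0.75])
probs = [np.trace(rho @ Em).real for Em in E]
print(probs)   # the two outcome probabilities sum to 1
```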
 
C. Decoherence
Up to now, quantum evolution is well defined and it is
in principle reversible. It would remain so if the environment
could be perfectly isolated from the macroscopic
degrees of freedom of the apparatus. This demand is of
course self-contradictory, since we have to read the result
of the measurement if we wish to make any use of it.
 
A detailed analysis of the interaction with the environment,
together with plausible hypotheses (Peres, 2000a),
shows that states of the environment that are correlated
with subspaces of C with different labels m can be treated
as if they were orthogonal. This is an excellent approximation
(physics is not an exact science, it is a science of
approximations). The resulting theoretical predictions
will almost always be correct, and if any rare small deviation
from them is ever observed, it will be considered
as a statistical quirk or an experimental error.
 
The density matrix of the quantum system is thus effectively
block diagonal, and all our statistical predictions
are identical to those obtained for an ordinary mixture
of (unnormalized) pure states …
 
This process is called decoherence. Each subspace
m is stable under decoherence—it is their relative
phase that decoheres. From this moment on, the macroscopic
degrees of freedom of C have entered into the
classical domain. We can safely observe them and ‘‘lay
on them our grubby hands’’ (Caves, 1982). In particular,
they can be used to trigger amplification mechanisms
(the so-called detector clicks) for the convenience of the
experimenter.
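A one-qubit caricature of this process (a toy sketch, not a full environment model): decoherence kills the off-diagonal relative phase in the pointer basis while leaving the diagonal, i.e., all statistical predictions for the outcomes, untouched.

```python
import numpy as np

a, b = 0.6, 0.8                   # real amplitudes with a^2 + b^2 = 1
rho = np.array([[a * a, a * b],
                [a * b, b * b]])   # pure superposition, written in the pointer basis

rho_dec = np.diag(np.diag(rho))    # decoherence: the relative phase (off-diagonal) is gone

# Outcome statistics in the pointer basis are identical before and after:
assert np.allclose(np.diag(rho), np.diag(rho_dec))
print(np.diag(rho_dec))
```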
 
Some authors claim that decoherence may provide a
solution of the ‘‘measurement problem,’’ with the particular
meaning that they attribute to that problem
(Zurek, 1991). Others dispute this point of view in their
comments on the above article (Zurek, 1993). A reassessment
of this issue and many important technical details
were recently published by Zurek (2002, 2003). Yet
decoherence has an essential role, as explained above. It
is essential that we distinguish decoherence, which results
from the disturbance of the environment by the
apparatus (and is a quantum effect), from noise, which
would result from the disturbance of the system or the
apparatus by the environment and would cause errors.
Noise is a mundane classical phenomenon, which we ignore
in this review.
 
E. The no-communication theorem
We now derive a sufficient condition that no instantaneous
information transfer can result from a distant intervention.
We shall show that the condition is
 
[A_m, B_n] = 0
 
where A_m and B_n are Kraus matrices for the observation
of outcomes m by Alice and n by Bob.
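A toy check of this condition (a sketch; the particular Kraus matrices are arbitrary illustrations): when Alice's Kraus matrices act as A_m ⊗ 1 on her tensor factor and Bob's as 1 ⊗ B_n on his, every pair commutes, and Bob's reduced density matrix is unaffected by Alice's intervention.

```python
import numpy as np

I2 = np.eye(2)

# Alice: an unread projective measurement on her qubit, extended by identity.
A = [np.kron(np.diag([1.0, 0.0]), I2), np.kron(np.diag([0.0, 1.0]), I2)]

# Bob: an amplitude-damping Kraus pair on his qubit, extended by identity.
g = 0.4
B = [np.kron(I2, np.array([[1, 0], [0, np.sqrt(1 - g)]])),
     np.kron(I2, np.array([[0, np.sqrt(g)], [0, 0]]))]

# The commutation condition holds for all pairs.
assert all(np.allclose(Am @ Bn, Bn @ Am) for Am in A for Bn in B)

def bob_marginal(r):
    """Partial trace over Alice's qubit (the first tensor factor)."""
    return np.einsum('ikil->kl', r.reshape(2, 2, 2, 2))

# An arbitrary two-qubit pure state:
psi = np.array([1, 2, 3, 4], dtype=complex)
rho = np.outer(psi, psi.conj()) / (psi.conj() @ psi).real

# Alice's (unread) intervention does not change Bob's statistics:
rho_after_alice = sum(Am @ rho @ Am.conj().T for Am in A)
assert np.allclose(bob_marginal(rho), bob_marginal(rho_after_alice))
```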
 
[My comment #8: "The most beautiful theory is murdered by an ugly fact." - Feynman
e.g. Libet-Radin-Bierman presponse in living brain data
SRI CIA vetted reports of remote viewing by living brains.
 
1. CIA-Initiated Remote Viewing At Stanford Research Institute: www.biomindsuperpowers.com/Pages/CIA-InitiatedRV.html
2. Harold E. Puthoff, Wikipedia: en.wikipedia.org/wiki/Harold_E._Puthoff
3. Remote viewing, Wikipedia: en.wikipedia.org/wiki/Remote_viewing
4. Dr. Harold Puthoff on Remote Viewing, YouTube (Apr 28, 2011): www.youtube.com/watch?v=FOAfH1utUSM
5. Remoteviewed.com, Hal Puthoff: www.remoteviewed.com/remote_viewing_halputhoff.htm
 
On Sep 4, 2013, at 9:06 AM, JACK SARFATTI wrote:
 
Peres here is only talking about Von Neumann's strong measurements not 
Aharonov's weak measurements.

Standard textbooks on quantum mechanics
tell you that observable quantities are represented by
Hermitian operators, that their possible values are the
eigenvalues of these operators, and that the probability
of detecting eigenvalue a, corresponding to eigenvector
|a>, is |<a|psi>|^2, where |psi> is the (pure) state of the
quantum system that is observed. With a bit more sophistication
to include mixed states, the probability can
be written in a general way as <a|rho|a> …
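The textbook rule just quoted can be exercised in a few lines (a sketch with an arbitrarily chosen observable and mixed state):

```python
import numpy as np

# Observable s_x; its eigenvectors |a> and eigenvalues a.
sigma_x = np.array([[0, 1], [1, 0]], dtype=complex)
eigvals, eigvecs = np.linalg.eigh(sigma_x)

# An arbitrary mixed state rho.
rho = np.diag([0.75, 0.25]).astype(complex)

# Born rule for mixed states: p(a) = <a|rho|a>.
probs = [(v.conj() @ rho @ v).real for v in eigvecs.T]
for a, p in zip(eigvals, probs):
    print(f"p(s_x = {a:+.0f}) = {p:.3f}")
```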
 
This is nice and neat, but it does not describe what
happens in real life. Quantum phenomena do not occur
in Hilbert space; they occur in a laboratory. If you visit a
real laboratory, you will never find Hermitian operators
there. All you can see are emitters (lasers, ion guns, synchrotrons,
and the like) and appropriate detectors. In
the latter, the time required for the irreversible act of
amplification (the formation of a microscopic bubble in
a bubble chamber, or the initial stage of an electric discharge)
is extremely brief, typically of the order of an
atomic radius divided by the velocity of light. Once irreversibility
has set in, the rest of the amplification process
is essentially classical. It is noteworthy that the time and
space needed for initiating the irreversible processes are
incomparably smaller than the macroscopic resolution
of the detecting equipment.
 
The experimenter controls the emission process and
observes detection events. The theorist’s problem is to
predict the probability of response of this or that detector,
for a given emission procedure. It often happens
that the preparation is unknown to the experimenter,
and then the theory can be used for discriminating between
different preparation hypotheses, once the detection
outcomes are known.
 
 
Many physicists, perhaps a majority, have an intuitive,
realistic worldview and consider a quantum state as a
physical entity. Its value may not be known, but in principle
the quantum state of a physical system would be
well defined. However, there is no experimental evidence
whatsoever to support this naive belief. On the
contrary, if this view is taken seriously, it may lead to
bizarre consequences, called ‘‘quantum paradoxes.’’
These so-called paradoxes originate solely from an incorrect
interpretation of quantum theory, which is thoroughly
pragmatic and, when correctly used, never yields
two contradictory answers to a well-posed question. It is
only the misuse of quantum concepts, guided by a pseudorealistic
philosophy, that leads to paradoxical results.
 
[My comment #2: Here is the basic conflict between epistemological vs ontological views of quantum reality.]
 
In this review we shall adhere to the view that ρ is
only a mathematical expression which encodes information
about the potential results of our experimental interventions.
The latter are commonly called
‘‘measurements’’—an unfortunate terminology, which
gives the impression that there exists in the real world
some unknown property that we are measuring. Even
the very existence of particles depends on the context of
our experiments. In a classic article, Mott (1929) wrote
‘‘Until the final interpretation is made, no mention
should be made of the a ray being a particle at all.’’
Drell (1978a, 1978b) provocatively asked ‘‘When is a
particle?’’ In particular, observers whose world lines are
accelerated record different numbers of particles, as will
be explained in Sec. V.D (Unruh, 1976; Wald, 1994).
 
 
1The theory of relativity did not cause as much misunderstanding
and controversy as quantum theory, because people
were careful to avoid using the same nomenclature as in nonrelativistic
physics. For example, elementary textbooks on
relativity theory distinguish ‘‘rest mass’’ from ‘‘relativistic
mass’’ (hard-core relativists call them simply ‘‘mass’’ and ‘‘energy’’).
2The ‘‘irreversible act of amplification’’ is part of quantum
folklore, but it is not essential to physics. Amplification is
needed solely to facilitate the work of the experimenter.
3Positive operators P are those having the property that
<psi|P|psi> >= 0 for any state |psi>. These operators are always Hermitian.
 
 
 
On Sep 4, 2013, at 8:48 AM, JACK SARFATTI wrote:



Begin forwarded message:

From: JACK SARFATTI
Subject: Quantum information and relativity theory
Date: September 4, 2013 8:33:48 AM PDT
To: nick herbert
 

The late Asher Peres (http://en.wikipedia.org/wiki/Asher_Peres) held a purely subjective, epistemological Bohrian interpretation of the quantum BIT potential Q; it is the antithesis of the late David Bohm's ontological interpretation (http://en.wikipedia.org/wiki/David_Bohm).
He claims that Antony Valentini's signal non locality beyond orthodox quantum theory would violate the Second Law of Thermodynamics.
 
REVIEWS OF MODERN PHYSICS, VOLUME 76, JANUARY 2004
Quantum information and relativity theory
Asher Peres
Department of Physics, Technion–Israel Institute of Technology, 32000 Haifa, Israel
Daniel R. Terno
Perimeter Institute for Theoretical Physics, Waterloo, Ontario, Canada N2J 2W9
(Published 6 January 2004)
This article discusses the intimate relationship between quantum mechanics, information theory, and
relativity theory. Taken together these are the foundations of present-day theoretical physics, and
their interrelationship is an essential part of the theory. The acquisition of information from a
quantum system by an observer occurs at the interface of classical and quantum physics. The authors
review the essential tools needed to describe this interface, i.e., Kraus matrices and
positive-operator-valued measures. They then discuss how special relativity imposes severe
restrictions on the transfer of information between distant systems and the implications of the fact that
quantum entropy is not a Lorentz-covariant concept. This leads to a discussion of how it comes about
that Lorentz transformations of reduced density matrices for entangled systems may not be
completely positive maps. Quantum field theory is, of course, necessary for a consistent description of
interactions. Its structure implies a fundamental tradeoff between detector reliability and
localizability. Moreover, general relativity produces new and counterintuitive effects, particularly
when black holes (or, more generally, event horizons) are involved. In this more general context the
authors discuss how most of the current concepts in quantum information theory may require a
reassessment.
CONTENTS
I. Three Inseparable Theories
A. Relativity and information
B. Quantum mechanics and information
C. Relativity and quantum theory
D. The meaning of probability
E. The role of topology
F. The essence of quantum information
II. The Acquisition of Information
A. The ambivalent quantum observer
B. The measuring process
C. Decoherence
D. Kraus matrices and positive-operator-valued measures (POVM's)
E. The no-communication theorem
III. The Relativistic Measuring Process
A. General properties
B. The role of relativity
C. Quantum nonlocality?
D. Classical analogies
IV. Quantum Entropy and Special Relativity
A. Reduced density matrices
B. Massive particles
C. Photons
D. Entanglement
E. Communication channels
V. The Role of Quantum Field Theory
A. General theorems
B. Particles and localization
C. Entanglement in quantum field theory
D. Accelerated detectors
VI. Beyond Special Relativity
A. Entanglement revisited
B. The thermodynamics of black holes
C. Open problems
Acknowledgments and Apologies
Appendix A: Relativistic State Transformations
Appendix B: Black-Hole Radiation
References
I. THREE INSEPARABLE THEORIES
Quantum theory and relativity theory emerged at the
beginning of the twentieth century to give answers to
unexplained issues in physics: the blackbody spectrum,
the structure of atoms and nuclei, the electrodynamics of
moving bodies. Many years later, information theory
was developed by Claude Shannon (1948) for analyzing
the efficiency of communication methods. How do these
seemingly disparate disciplines relate to each other? In
this review, we shall show that they are inseparably
linked.
A. Relativity and information
Common presentations of relativity theory employ
fictitious observers who send and receive signals. These
‘‘observers’’ should not be thought of as human beings,
but rather as ordinary physical emitters and detectors.
Their role is to label and locate events in spacetime. The
speed of transmission of these signals is bounded by
c—the velocity of light—because information needs a
material carrier, and the latter must obey the laws of
physics. Information is physical (Landauer, 1991).
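Landauer's dictum can be made quantitative. As an illustration not spelled out in the paper, Landauer's principle bounds the heat dissipated when one bit of information is erased at temperature T:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_limit(temperature_kelvin):
    """Minimum energy cost, in joules, of erasing one bit at temperature T:
    E >= k_B * T * ln 2 (Landauer's principle)."""
    return K_B * temperature_kelvin * math.log(2)

print(landauer_limit(300.0))  # ~2.87e-21 J per bit at room temperature
```

The bound is tiny compared to the dissipation of real devices, but it ties information directly to thermodynamics, which is the sense of Landauer's slogan.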
 
[My comment #1: Indeed, information is physical. Contrary to Peres, in Bohm's theory Q is also physical but not material (not a "beable"); consequently, one can have entanglement negentropy transfer without beable (material) propagation of a classical signal. I think Peres makes a fundamental error here.]
 
However, the mere existence of an upper bound on
the speed of propagation of physical effects does not do
justice to the fundamentally new concepts that were introduced
by Albert Einstein (one could as well imagine
communications limited by the speed of sound, or that
of the postal service). Einstein showed that simultaneity
had no absolute meaning, and that distant events might
have different time orderings when referred to observers
in relative motion. Relativistic kinematics is all about
information transfer between observers in relative motion.
 
Classical information theory involves concepts such as
the rates of emission and detection of signals, and the
noise power spectrum. These variables have well defined
relativistic transformation properties, independent
of the actual physical implementation of the communication
system.
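As a concrete instance of such a transformation law (a standard special-relativity result, added here for illustration rather than taken from the paper), the detection rate of signals from a receding emitter is reduced by the longitudinal relativistic Doppler factor:

```python
import math

def received_rate(nu0, beta):
    """Rate at which signals emitted at proper rate nu0 are detected
    when the emitter recedes at speed beta = v/c (longitudinal
    relativistic Doppler formula)."""
    return nu0 * math.sqrt((1 - beta) / (1 + beta))

# At beta = 0.6 the Doppler factor is sqrt(0.4/1.6) = 0.5,
# so a 100 Hz emission is detected at roughly 50 Hz.
print(received_rate(100.0, 0.6))
```

The same factor governs emission and detection rates regardless of whether the signals are radio pulses or letters carried by relativistic mail, which is the point of the paragraph above: the transformation properties do not depend on the physical implementation.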