Alladi Ramakrishnan Hall
Frequentist-approach-inspired theory of quantum random phenomena predicts signaling
C. S. Sudheer Kumar
IISER Pune
Consider the following problems. Why can we not distinguish
between the x and z ensembles, even though they are physically
different, when each is described by the same density matrix
(arXiv:1811.05472 [quant-ph])? Why does the Gibbs-von Neumann entropy
(which relies on the density-matrix description) give wrong
predictions for closed non-equilibrium systems (arXiv:1903.11870
[cond-mat.stat-mech])? And so on.
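As a minimal numerical sketch of the first problem (illustrative
only, not taken from the talk): an equal mixture of the sigma_z
eigenstates (the z ensemble) and an equal mixture of the sigma_x
eigenstates (the x ensemble) are physically different preparations,
yet both yield the same density matrix I/2, e.g. in numpy:

    import numpy as np

    # Computational (z) basis states as column vectors
    ket0 = np.array([[1], [0]], dtype=complex)
    ket1 = np.array([[0], [1]], dtype=complex)

    # x-basis states |+> and |->
    ketp = (ket0 + ket1) / np.sqrt(2)
    ketm = (ket0 - ket1) / np.sqrt(2)

    # Equal-weight mixtures: z ensemble and x ensemble
    rho_z = 0.5 * (ket0 @ ket0.conj().T) + 0.5 * (ket1 @ ket1.conj().T)
    rho_x = 0.5 * (ketp @ ketp.conj().T) + 0.5 * (ketm @ ketm.conj().T)

    print(np.allclose(rho_z, rho_x))  # True: both equal I/2

Since every standard quantum prediction is a function of the density
matrix alone, the two ensembles are operationally indistinguishable
in the conventional formalism.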
The root cause of all these problems is the a priori assumption that
a probability measure exists: a purely mathematical quantity, not
physically justifiable, on which the density-matrix description
rests. Such a probability measure does not exist physically, because
the limiting relative frequency does not converge pointwise to the
theoretically assumed constant value for the probability of a random
event (for instance, an unending run of heads is a logically possible
outcome sequence of fair coin tosses, along which the relative
frequency converges to 1 rather than the assumed 1/2). Hence the a
priori assumption of a probability measure, however intuitively
appealing, is physically incorrect. Moreover, such an assumption is
not really necessary, even though it is useful for many practical
purposes because it simplifies calculations. Ockham's razor therefore
motivates us to drop the a priori assumption of a probability measure
completely. We proceed path by path: everything is pointwise (e.g.,
the fluctuation in itself, i.e., the actual fluctuation along a given
path), with no a priori probabilistic measures (such as the standard
deviation, which is an averaged measure of fluctuation)
(arXiv:1903.12096 [quant-ph]). Consequently, all the above problems
are fixed: we can distinguish between the x and z ensembles (which
leads to signaling via entangled particles), the Boltzmann entropy
gives the correct prediction, and so on.
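The path-by-path viewpoint can be contrasted with the averaged one in
a simple simulation (a sketch only: the pseudorandom generator itself
presupposes a probability measure, so this merely illustrates the
distinction between the actual fluctuation on a single path and an
averaged measure of fluctuation, not the measure-free framework
itself):

    import numpy as np

    rng = np.random.default_rng(0)  # fixed seed for reproducibility
    N = 100_000

    # One concrete path: N two-outcome "quantum coin flips"
    # (e.g., sigma_z measurement outcomes on the state |+x>)
    path = rng.integers(0, 2, size=N)

    # Running relative frequency of outcome 1 along this single path
    rel_freq = np.cumsum(path) / np.arange(1, N + 1)

    # Actual (pointwise) fluctuation on this path vs. the averaged
    # measure: the standard deviation of the relative frequency,
    # sqrt(p(1-p)/n) = 1/(2*sqrt(n)) for p = 1/2
    for k in (10, 100, 1_000, 10_000, 100_000):
        actual = abs(rel_freq[k - 1] - 0.5)  # this path's fluctuation
        averaged = 0.5 / np.sqrt(k)          # ensemble-averaged measure
        print(k, actual, averaged)

At any finite length the relative frequency along the sampled path
deviates from 1/2 by a path-specific amount, which the averaged
quantity (the standard deviation) does not track path by path.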
I will also talk briefly about going beyond the Tsirelson bound and
quantum cryptography; an NMR investigation of contextuality; Lüders
and von Neumann measuring devices; and the amplification of quantum
Fisher information via pre-correlated ancillas.