Measure, Probability and Quantum
John Skilling (Maximum Entropy Data Consultants)
Friday 29th September, 2017 15:00-16:00 Seminar room 311B
In mathematics, addition tends to be assumed as basic. Yet summation is in fact required by the symmetries of combination. We get
Quantity(X) = scalar obeying standard sum rule.
Combination is necessarily addition "+". That's basic measure theory.
In inference, we investigate partitioning as the reverse of combination. Symmetries apply here too, and this gives us
Pr(X | Y) = scalar obeying standard sum and product rules of proportion.
Partitioning is necessarily division "÷". That's basic probability calculus. Meanwhile, independence is necessarily direct product "×".
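A minimal sketch of the sum and product rules of proportion, with partitioning as division. The joint table here (weather versus ground state) is invented purely for illustration and is not part of the abstract:

```python
# Hypothetical joint distribution Pr(X, Y) over two variables.
joint = {("rain", "wet"): 0.30, ("rain", "dry"): 0.10,
         ("sun", "wet"): 0.05, ("sun", "dry"): 0.55}

def marginal(x):
    # Sum rule: Pr(X) = sum over Y of Pr(X, Y).
    return sum(p for (xi, _), p in joint.items() if xi == x)

def conditional(y, x):
    # Product rule rearranged: Pr(Y | X) = Pr(X, Y) / Pr(X).
    # Partitioning the joint quantity is division.
    return joint[(x, y)] / marginal(x)

pr_rain = marginal("rain")                # 0.40
pr_wet_given_rain = conditional("wet", "rain")  # 0.30 / 0.40 = 0.75
```

Conditionals over a complete partition again obey the sum rule: Pr(wet | rain) + Pr(dry | rain) = 1.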
In information, the symmetries of independent systems give us the entropy (Kullback-Leibler) formula.
Divergence of distribution $P$ from $Q$ = $\sum_i p_i \log(p_i/q_i)$
That's basic information theory. Distributions do not form a metric space because the only connection is asymmetric, whereas a distance would have to be symmetric.
In basic physics, small objects are necessarily disturbed by observation, because the observing device can itself be no smaller. As we approach that situation, we need a calculus of interactions between at least two objects, so we need a pair-valued description, not just scalars. We still get sum and product rules, with pairs behaving as complex numbers. Physicists call these the "Feynman rules", and the descriptive pairs $\psi$ relate to observation through their "Born rule" probabilities $p = |\psi|^2$. That's basic quantum theory.
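The pair-valued sum rule can be sketched with two interfering paths. The amplitudes below are invented for illustration: adding amplitudes before applying the Born rule gives a different answer from adding the scalar probabilities directly, which is exactly what the pair-valued description buys:

```python
import cmath

# Two hypothetical path amplitudes with equal-and-opposite phase.
psi1 = 0.6 + 0j
psi2 = 0.8 * cmath.exp(1j * cmath.pi)   # magnitude 0.8, phase pi

# Scalar (classical) combination: add Born-rule probabilities.
p_scalar = abs(psi1)**2 + abs(psi2)**2   # 0.36 + 0.64 = 1.0

# Pair-valued (Feynman) combination: add amplitudes, then apply Born rule.
p_pair = abs(psi1 + psi2)**2             # |0.6 - 0.8|^2 = 0.04
```

The two results disagree because the sum rule acts on the complex pairs, not on the scalar probabilities; interference is the visible consequence.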
There is no mystery. Quantum theory is just a necessary consequence of the same elementary symmetries that already gave us measure theory and probability theory. Statistics and physics share a common rational foundation. Foundations matter because sympathetic understanding diverts educated users away from false trails.
Clarity and simplicity ⇒ generality and power