
Caticha A. Entropic Inference and the Foundations of Physics

N.Y.: International Society for Bayesian Analysis, 2012. 293 p.
Science consists in using information about the world for the purpose of predicting, explaining, understanding, and/or controlling phenomena of interest.
The basic difficulty is that the available information is usually insufficient to attain any of those goals with certainty. A central concern in these lectures will be the problem of inductive inference, that is, the problem of reasoning under conditions of incomplete information.
Our goal is twofold. First, to develop the main tools for inference - probability and entropy - and to demonstrate their use. And second, to demonstrate their importance for physics. More specifically, our goal is to clarify the conceptual foundations of physics by deriving the fundamental laws of statistical mechanics and of quantum mechanics as examples of inductive inference. Perhaps all physics can be derived in this way.
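The central tool the lectures develop, maximum entropy (MaxEnt), can be illustrated with the three-sided die treated later in the book: given only an expected face value, the distribution that maximizes the Shannon entropy has the exponential form p_i ∝ exp(-λx_i). The sketch below is illustrative and not taken from the book; the function name `maxent_die` and the bisection solver for λ are assumptions of this example.

```python
import math

def maxent_die(faces, mean_target, lo=-50.0, hi=50.0):
    """Maximum-entropy distribution over `faces` subject to a mean constraint.

    The MaxEnt solution is p_i proportional to exp(-lam * x_i); the
    multiplier lam is found by bisection, using the fact that the
    resulting mean is a decreasing function of lam.
    """
    def mean(lam):
        w = [math.exp(-lam * x) for x in faces]
        z = sum(w)
        return sum(x * wi for x, wi in zip(faces, w)) / z

    for _ in range(200):  # bisect until the bracket is numerically exhausted
        mid = 0.5 * (lo + hi)
        if mean(mid) > mean_target:
            lo = mid  # mean too large -> need a larger multiplier
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(-lam * x) for x in faces]
    z = sum(w)
    return [wi / z for wi in w]

# With faces 1, 2, 3 and expected value 2.0 (the symmetric case),
# MaxEnt assigns the uniform distribution, as one would hope.
probs = maxent_die([1, 2, 3], 2.0)
```

Constraining the mean to 2.5 instead tilts the distribution toward the higher faces while still maximizing entropy among all distributions with that mean.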
Contents:

Foreword
Preface
Inductive Inference and Physics
Probability
Designing a framework for inductive inference
Entropic Physics
Probability
The design of probability theory
Rational beliefs?
Quantifying rational belief
The sum rule
The associativity constraint
The general solution and its regraduation
The general sum rule
Cox's proof
The product rule
From four arguments down to two
The distributivity constraint
Some remarks on the sum and product rules
On meaning, ignorance and randomness
Independent and mutually exclusive events
Marginalization
The expected value
The binomial distribution
Probability vs frequency: the law of large numbers
The Gaussian distribution
The de Moivre-Laplace theorem
The Central Limit Theorem
Updating probabilities: Bayes' rule
Formulating the problem
Minimal updating: Bayes' rule
Multiple experiments, sequential updating
Remarks on priors
Hypothesis testing and confirmation
Examples from data analysis
Parameter estimation
Curve fitting
Model selection
Maximum Likelihood
Entropy I: The Evolution of Carnot's Principle
Carnot: reversible engines
Kelvin: temperature
Clausius: entropy
Maxwell: probability
Gibbs: beyond heat
Boltzmann: entropy and probability
Some remarks
Entropy II: Measuring Information
Shannon's information measure
Relative entropy
Joint entropy, additivity, and subadditivity
Conditional entropy and mutual information
Continuous distributions
Experimental design
Communication Theory
Assigning probabilities: MaxEnt
Canonical distributions
On constraints and relevant information
Avoiding pitfalls - I
MaxEnt cannot fix flawed information
MaxEnt cannot supply missing information
Sample averages are not expected values
Statistical Mechanics
Liouville's theorem
Derivation of Equal a Priori Probabilities
The relevant constraints
The canonical formalism
Equilibrium with a heat bath of finite size
The Second Law of Thermodynamics
The thermodynamic limit
Interpretation of the Second Law: Reproducibility
Remarks on irreversibility
Entropies, descriptions and the Gibbs paradox
Entropy III: Updating Probabilities
What is information?
The design of entropic inference
General criteria
Entropy as a tool for updating probabilities
Specific design criteria
The ME method
The proofs
An alternative independence criterion: consistency
Random remarks
On priors
Comments on other axiomatizations
Bayes' rule as a special case of ME
Commuting and non-commuting constraints
Conclusion
Information Geometry
Examples of statistical manifolds
Vectors in curved spaces
Distance and volume in curved spaces
Derivations of the information metric
Derivation from distinguishability
Derivation from a Euclidean metric
Derivation from asymptotic inference
Derivation from relative entropy
Uniqueness of the information metric
The metric for some common distributions
Entropy IV: Entropic Inference
Deviations from maximum entropy
The ME method
An application to fluctuations
Avoiding pitfalls - II
The three-sided die
Understanding ignorance
Entropic Dynamics: Time and Quantum Theory
The statistical model
Entropic dynamics
Entropic time
Time as a sequence of instants
Duration: a convenient time scale
The directionality of entropic time
Accumulating changes
Derivation of the Fokker-Planck equation
The current and osmotic velocities
Non-dissipative diffusion
Manifold dynamics
Classical limits
The Schrödinger equation
A quantum equivalence principle
Entropic time vs physical time
Dynamics in an external electromagnetic field
An additional constraint
Entropic dynamics
Gauge invariance
Is ED a hidden-variable model?
Summary and Conclusions
Topics in Quantum Theory
The quantum measurement problem
Observables other than position
Amplification
But isn't the measuring device a quantum system too?
Momentum in Entropic Dynamics
Expected values
Uncertainty relations
Discussion
An aside: the hybrid  = theory
Conclusions
References