Information Theory, Relative Entropy and Statistics

Bibliographic Details
Main Author: Bavaud, François
Format: Journal Article
Language: English
Published: 29.08.2008
DOI: 10.48550/arxiv.0808.4111

More Information
Summary: Bavaud F. (2009) Information Theory, Relative Entropy and Statistics. In: Sommaruga G. (editor): Formal Theories of Information. Lecture Notes in Computer Science 5363, Springer, pp. 54-78. Formalising the confrontation of opinions (models) with observations (data) is the task of Inferential Statistics. Information Theory provides us with a basic functional, the relative entropy (or Kullback-Leibler divergence), an asymmetrical measure of dissimilarity between the empirical and the theoretical distributions. The formal properties of the relative entropy turn out to capture every aspect of Inferential Statistics, as illustrated here, for simplicity, on dice (i.e. i.i.d. processes with finitely many outcomes): refutability (strict or probabilistic): the asymmetry between data and models; small deviations: rejecting a single hypothesis; competition between hypotheses and model selection; maximum likelihood: model inference and its limits; maximum entropy: reconstructing partially observed data; the EM algorithm; flow data and gravity modelling; determining the order of a Markov chain.
DOI: 10.48550/arxiv.0808.4111
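
The summary above describes the relative entropy as an asymmetrical measure of dissimilarity between the empirical and the theoretical distributions. As a minimal sketch of that definition on the dice example, the following Python snippet (the counts, variable names and helper function are illustrative assumptions, not taken from the paper) computes K(f || p) for hypothetical die-roll frequencies against a fair-die model and exhibits the asymmetry.

```python
import numpy as np

def relative_entropy(f, p):
    """K(f || p) = sum_i f_i * log(f_i / p_i), with the convention 0 * log 0 = 0."""
    f = np.asarray(f, dtype=float)
    p = np.asarray(p, dtype=float)
    mask = f > 0
    return float(np.sum(f[mask] * np.log(f[mask] / p[mask])))

# Hypothetical counts from n = 60 rolls of a die: the empirical distribution f.
counts = np.array([8, 12, 10, 9, 11, 10])
n = counts.sum()
f = counts / n

# Theoretical model p: a fair die.
p = np.full(6, 1.0 / 6.0)

print(relative_entropy(f, p))  # K(f || p): dissimilarity of the data to the model
print(relative_entropy(p, f))  # K(p || f): generally different, since K is asymmetric

# Standard asymptotics (not specific to this paper): under the model p,
# 2 * n * K(f || p) is asymptotically chi-squared distributed, which is the kind of
# statistic used to reject a single hypothesis from "small deviations" of f around p.
print(2 * n * relative_entropy(f, p))
```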