Information Theory, Relative Entropy and Statistics
| Main Author | Bavaud, F. |
|---|---|
| Format | Journal Article |
| Language | English |
| Published | 29.08.2008 |
| DOI | 10.48550/arxiv.0808.4111 |
| Summary | Bavaud F. (2009) Information Theory, Relative Entropy and Statistics. In: Sommaruga G. (editor): Formal Theories of Information. Lecture Notes in Computer Science 5363, Springer, pp. 54-78. Formalising the confrontation of opinions (models) with observations (data) is the task of Inferential Statistics. Information Theory provides us with a basic functional, the relative entropy (or Kullback-Leibler divergence), an asymmetrical measure of dissimilarity between the empirical and the theoretical distributions. The formal properties of the relative entropy turn out to capture every aspect of Inferential Statistics, as illustrated here, for simplicity, on dice (i.e. i.i.d. processes with finitely many outcomes): refutability (strict or probabilistic) and the data/model asymmetry; small deviations and the rejection of a single hypothesis; competition between hypotheses and model selection; maximum likelihood, model inference and its limits; maximum entropy and the reconstruction of partially observed data; the EM algorithm; flow data and gravity modelling; determining the order of a Markov chain. |
|---|---|
| DOI | 10.48550/arxiv.0808.4111 |
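The central quantity named in the summary, the relative entropy K(f || g) = sum_i f_i ln(f_i / g_i) between an empirical distribution f and a theoretical distribution g, is straightforward to compute. The following minimal Python sketch is not taken from the paper; the dice counts are invented purely for illustration. It evaluates the divergence of dice data against a fair-die model and shows that the two orderings give different values.

```python
# Minimal sketch (not from the paper): relative entropy K(f || g) between an
# empirical dice distribution f and a theoretical model g.
import math

def relative_entropy(f, g):
    """Kullback-Leibler divergence sum_i f_i * ln(f_i / g_i), in nats."""
    return sum(fi * math.log(fi / gi) for fi, gi in zip(f, g) if fi > 0)

# Hypothetical counts from n = 100 throws of a die (illustration only).
counts = [12, 15, 20, 18, 14, 21]
n = sum(counts)
f = [c / n for c in counts]      # empirical distribution (data)
g = [1 / 6] * 6                  # theoretical distribution: a fair die (model)

print(relative_entropy(f, g))    # K(f || g): data measured against the model
print(relative_entropy(g, f))    # K(g || f): generally a different value
```

The two values generally differ, which is the data/model asymmetry the summary refers to.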