ON THE NOTION OF DISTANCE REPRESENTING INFORMATION CLOSENESS: Possibility and Probability Distributions
| Published in | International journal of general systems Vol. 9; no. 2; pp. 103 - 115 |
|---|---|
| Main Authors | , |
| Format | Journal Article |
| Language | English |
| Published | Taylor & Francis Group, 01.01.1983 |
| Subjects | |
| ISSN | 0308-1079 1563-5104 |
| DOI | 10.1080/03081078308960805 |
| Summary: | A metric distance based on information variation is derived in this paper for possibility distributions (function G defined by (6), where g is defined by (2) and U is defined by (1)). It is applicable to any pair of normalized possibility distributions defined on a finite set X and either it is unique or, if not unique, it represents the maximum distance in the class of such metric distances. It is an open problem to either derive this class or to prove the uniqueness of our distance. A similar measure, based on the well-known directed information divergence, is proposed for probability distributions defined on a finite set (function D defined by (12), where d is defined by (11)). The measure is nondegenerate and symmetric, and it is our conjecture, supported by empirical evidence, that it also satisfies the triangle inequality requirement of metric distances. A mathematical proof of this conjecture remains an open problem. |
|---|---|
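The summary cites the uncertainty measure U (equation (1)) without reproducing it. The sketch below, a non-authoritative illustration, assumes U is the standard Higashi-Klir U-uncertainty of a normalized possibility distribution; the functions g and G (equations (2) and (6)) are not reproduced in this record and are not reconstructed here.

```python
import math

def u_uncertainty(r):
    """Sketch of the U-uncertainty of a normalized possibility distribution.

    Assumption: U(r) = sum_i (r_i - r_{i+1}) * log2(i), with the values of r
    sorted in non-increasing order and r_{n+1} = 0. This is the standard
    U-uncertainty measure; equation (1) of the paper is not shown in this
    record, so this form is assumed rather than quoted.
    """
    ordered = sorted(r, reverse=True) + [0.0]  # r_1 >= r_2 >= ... >= r_n >= 0
    return sum((ordered[i - 1] - ordered[i]) * math.log2(i)
               for i in range(1, len(ordered)))
```

For a crisp possibility distribution with k elements fully possible, this reduces to the Hartley measure log2(k), which is one sanity check on the assumed form.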
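For the probabilistic measure, the summary refers to the well-known directed information divergence d (equation (11)) and a symmetric measure D built from it (equation (12)), neither of which is reproduced in this record. The sketch below uses the Kullback-Leibler directed divergence and the classical Jeffreys symmetrization d(p, q) + d(q, p) as an illustrative stand-in for D; the paper's actual definitions may differ.

```python
import math

def directed_divergence(p, q):
    """Kullback-Leibler directed divergence d(p, q) = sum_i p_i * log2(p_i / q_i).

    Assumes p and q are probability distributions on the same finite set,
    with q_i > 0 wherever p_i > 0 (terms with p_i = 0 contribute nothing).
    """
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def symmetric_divergence(p, q):
    """Symmetric, nondegenerate measure obtained by symmetrizing d.

    Hypothetical stand-in for the function D of equation (12): the Jeffreys
    symmetrization d(p, q) + d(q, p). It is symmetric by construction and
    zero exactly when p = q, matching the properties stated in the summary;
    whether such a measure satisfies the triangle inequality is the open
    conjecture the abstract describes.
    """
    return directed_divergence(p, q) + directed_divergence(q, p)
```

A quick check of the stated properties: `symmetric_divergence(p, p)` is 0 for any distribution p, and swapping the arguments leaves the value unchanged.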