Information Theory
Normalized Information-Based Divergences
J.-F. Coeurjolly (a), R. Drouilhet, J.-F. Robineau
(a) Université Pierre Mendès France - Grenoble 2
Abstract:
This paper is devoted to the mathematical study of some divergences based on
mutual information which are well suited to categorical random vectors. These divergences are
generalizations of the “entropy distance” and “information distance.” Their main characteristic
is that they combine a complexity term and the mutual information. We then introduce the
notion of (normalized) information-based divergence, propose several examples, and discuss
their mathematical properties, in particular within a prediction framework.
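
For context, the two divergences that the paper generalizes can be written in Shannon-entropy terms. The display below shows the classical entropy distance together with two standard normalizations (Rajski's joint-entropy normalization and a max-entropy normalization analogous to the normalized information distance); it is given only as an illustration of the "complexity term plus mutual information" structure, not as the paper's exact definitions.

\[
d(X,Y) \;=\; H(X,Y) - I(X;Y) \;=\; H(X\mid Y) + H(Y\mid X),
\qquad
D(X,Y) \;=\; 1 - \frac{I(X;Y)}{H(X,Y)},
\qquad
D'(X,Y) \;=\; 1 - \frac{I(X;Y)}{\max\{H(X),H(Y)\}}.
\]

Both normalized quantities take values in [0,1]: they vanish exactly when X and Y determine each other almost surely and equal 1 exactly when X and Y are independent.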
Received: 11.04.2006; revised: 16.05.2007
Citation:
J.-F. Coeurjolly, R. Drouilhet, J.-F. Robineau, “Normalized Information-Based Divergences”, Probl. Peredachi Inf., 43:3 (2007), 3–27; Problems Inform. Transmission, 43:3 (2007), 167–189
Linking options:
https://www.mathnet.ru/eng/ppi15
https://www.mathnet.ru/eng/ppi/v43/i3/p3