Problemy Peredachi Informatsii, 2009, Volume 45, Issue 4, Pages 3–17
(Mi ppi1995)
This article is cited in 5 scientific papers.
Information Theory
Mutual information of several random variables and its estimation via variation
V. V. Prelov, A. A. Kharkevich Institute for Information Transmission Problems, Russian Academy of Sciences
Abstract:
We obtain upper and lower bounds for the maximum of the mutual information of several random variables in terms of the variational distance between the joint distribution of these random variables and the product of its marginal distributions. In this connection, some properties of the variational distance between probability distributions of this type are derived. We show that in some special cases the estimates of the maximum of mutual information obtained here are optimal or asymptotically optimal. Some results of this paper generalize the corresponding results of [1–3] to the multivariate case.
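The two quantities the paper relates can be illustrated with a small numerical sketch. This is not the paper's bounds or multivariate setting, only an assumed two-variable example showing the mutual information and the variational distance between a joint distribution and the product of its marginals:

```python
import numpy as np

# Hypothetical joint distribution of a pair (X, Y); the paper treats
# the general multivariate case, this is only a bivariate illustration.
p = np.array([[0.4, 0.1],
              [0.1, 0.4]])

px = p.sum(axis=1)   # marginal distribution of X
py = p.sum(axis=0)   # marginal distribution of Y
q = np.outer(px, py) # product of the marginal distributions

# Mutual information I(X; Y) in nats: the KL divergence of p from q.
mask = p > 0
I = np.sum(p[mask] * np.log(p[mask] / q[mask]))

# Variational distance between p and q: sum of absolute differences.
V = np.sum(np.abs(p - q))

print(I, V)
```

For this joint distribution the variational distance is 0.6, and the classical Pinsker inequality guarantees I ≥ V²/8 in this normalization; the paper's results give sharper two-sided estimates of the maximum of mutual information for a given variation.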
Received: 12.05.2009
Citation:
V. V. Prelov, “Mutual information of several random variables and its estimation via variation”, Probl. Peredachi Inf., 45:4 (2009), 3–17; Problems Inform. Transmission, 45:4 (2009), 295–308
Linking options:
https://www.mathnet.ru/eng/ppi1995
https://www.mathnet.ru/eng/ppi/v45/i4/p3