Problemy Peredachi Informatsii, 2010, Volume 46, Issue 2, Pages 24–29
This article is cited in 4 scientific papers.
Information Theory
On computation of information via variation and inequalities for the entropy function
V. V. Prelov
A. A. Kharkevich Institute for Information Transmission Problems, Russian Academy of Sciences
Abstract:
This paper supplements the author's paper [1]. We obtain an explicit formula which in a special case allows us to calculate the maximum of mutual information of several random variables via the variational distance between the joint distribution of these random variables and the product of their marginal distributions. We establish two new inequalities for the binary entropy function, which are related to the problem considered here.
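For orientation, here is a minimal sketch in LaTeX of the standard quantities named in the abstract. These are the usual textbook definitions, not the paper's explicit formula (which appears only in the full text), and reading the mutual information of several random variables as the divergence between the joint distribution and the product of marginals is an assumption suggested by the abstract's wording.

% Standard definitions (assumed conventions, not taken from the paper):
% binary entropy function
h(p) = -p\log p - (1-p)\log(1-p), \qquad 0 \le p \le 1;
% variational distance between distributions P and Q on a common alphabet
\|P - Q\| = \sum_{x} \bigl| P(x) - Q(x) \bigr|;
% mutual information of X_1,\dots,X_n, read here as the Kullback--Leibler divergence
% between the joint distribution and the product of marginals (an assumption)
I(X_1,\dots,X_n) = D\bigl( P_{X_1 \dots X_n} \,\big\|\, P_{X_1} \times \dots \times P_{X_n} \bigr).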
Received: 26.01.2010
Citation:
V. V. Prelov, “On computation of information via variation and inequalities for the entropy function”, Probl. Peredachi Inf., 46:2 (2010), 24–29; Problems Inform. Transmission, 46:2 (2010), 122–126
Linking options:
https://www.mathnet.ru/eng/ppi2013
https://www.mathnet.ru/eng/ppi/v46/i2/p24