Problemy Peredachi Informatsii, 2016, Volume 52, Issue 4, Pages 3–13
(Mi ppi2218)
Information Theory
On some extremal problems for mutual information and entropy
V. V. Prelov
Kharkevich Institute for Information Transmission Problems, Russian Academy of Sciences, Moscow, Russia
Abstract:
The problem of determining the maximum mutual information $I(X;Y)$ and the minimum entropy $H(X,Y)$ of a pair of discrete random variables $X$ and $Y$ is considered under the condition that the probability distribution of $X$ is fixed and the error probability $\mathrm{Pr}\{Y\ne X\}$ takes a given value $\varepsilon$, $0\le\varepsilon\le1$. Precise values for these extrema are obtained; in several cases this yields explicit formulas for both the maximum mutual information and the minimum entropy in terms of the probability distribution of $X$ and the parameter $\varepsilon$.
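To make the extremal problem concrete, below is a minimal numerical sketch (not from the paper) for the binary case, assuming $X$ and $Y$ share the alphabet $\{0,1\}$. With the marginal of $X$ fixed at $(p_0, 1-p_0)$ and $\mathrm{Pr}\{Y\ne X\}=\varepsilon$, every feasible joint distribution is determined by the single free parameter $t=\mathrm{Pr}\{X=0,Y=1\}$, so a grid scan over $t$ recovers the extrema numerically. The function names (`mutual_information`, `binary_extremes`) are illustrative, not the author's.

```python
import numpy as np

def mutual_information(q):
    """I(X;Y) in bits for a joint distribution matrix q[x, y]."""
    px = q.sum(axis=1)
    py = q.sum(axis=0)
    prod = np.outer(px, py)
    mask = q > 0
    return float(np.sum(q[mask] * np.log2(q[mask] / prod[mask])))

def joint_entropy(q):
    """H(X,Y) in bits."""
    mask = q > 0
    return float(-np.sum(q[mask] * np.log2(q[mask])))

def binary_extremes(p0, eps, grid=20001):
    """Scan the free parameter t = Pr{X=0, Y=1}: every feasible joint
    with X-marginal (p0, 1-p0) and Pr{Y != X} = eps has the form below."""
    p1 = 1.0 - p0
    lo, hi = max(0.0, eps - p1), min(eps, p0)  # feasibility bounds on t
    best_I, best_H = -np.inf, np.inf
    for t in np.linspace(lo, hi, grid):
        q = np.array([[p0 - t, t],
                      [eps - t, p1 - (eps - t)]])
        q = np.clip(q, 0.0, None)  # guard against tiny negatives from rounding
        best_I = max(best_I, mutual_information(q))
        best_H = min(best_H, joint_entropy(q))
    return best_I, best_H

# Example: X ~ (0.3, 0.7), error probability 0.1
print(binary_extremes(p0=0.3, eps=0.1))
```

For larger alphabets the feasible set of joint distributions is higher-dimensional and a one-parameter scan no longer suffices; the paper instead determines these extrema exactly, with explicit formulas in several cases.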
Received: 01.12.2015 Revised: 14.10.2016
Citation:
V. V. Prelov, “On some extremal problems for mutual information and entropy”, Probl. Peredachi Inf., 52:4 (2016), 3–13; Problems Inform. Transmission, 52:4 (2016), 319–328
Linking options:
https://www.mathnet.ru/eng/ppi2218
https://www.mathnet.ru/eng/ppi/v52/i4/p3