Problemy Peredachi Informatsii, 2014, Volume 50, Issue 3, Pages 3–18
(Mi ppi2141)
This article is cited in 3 scientific papers.
Information Theory
On one extreme value problem for entropy and error probability
V. V. Prelov, Kharkevich Institute for Information Transmission Problems, Russian Academy of Sciences, Moscow, Russia
Abstract:
We consider the problem of determining the maximum and minimum entropy of a random variable $Y$, as well as the maximum absolute value of the difference between the entropies of $Y$ and another random variable $X$, under the conditions that the probability distribution of $X$ is fixed and the error probability (i.e., the probability that the random values of $X$ and $Y$ do not coincide) is given. An exact expression for the minimum entropy of $Y$ is found. Conditions under which the entropy of $Y$ attains its maximum value are identified. In the remaining cases, lower and upper bounds are obtained both for the maximum entropy of $Y$ and for the maximum absolute value of the difference between the entropies of $Y$ and $X$.
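The quantities in this problem can be made concrete with a small numerical sketch. The example below (all distributions are hypothetical and chosen for illustration; it does not reproduce the paper's extremal expressions) shows two joint distributions with the same fixed marginal for $X$ and the same error probability $\Pr\{X \neq Y\}$, yet different entropies $H(Y)$, which is why an extremization over couplings is meaningful:

```python
import math

def entropy(p):
    """Shannon entropy (in bits) of a probability vector."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def error_probability(joint):
    """Pr{X != Y} for joint[i][j] = Pr{X = i, Y = j}."""
    return 1.0 - sum(joint[i][i] for i in range(len(joint)))

# Hypothetical couplings: X has fixed distribution (0.7, 0.3) in both,
# and both have error probability Pr{X != Y} = 0.2,
# but the marginals of Y (column sums) differ.
joint_a = [[0.5, 0.2],
           [0.0, 0.3]]   # Y ~ (0.5, 0.5)
joint_b = [[0.7, 0.0],
           [0.2, 0.1]]   # Y ~ (0.9, 0.1)

for joint in (joint_a, joint_b):
    p_y = [sum(row[j] for row in joint) for j in range(2)]
    print(f"Pr{{X != Y}} = {error_probability(joint):.2f}, "
          f"H(Y) = {entropy(p_y):.3f} bits")
```

Here $H(0.5, 0.5) = 1$ bit while $H(0.9, 0.1) \approx 0.469$ bits, so with the distribution of $X$ and the error probability both fixed, $H(Y)$ still varies over the admissible joint distributions; the paper characterizes its extreme values.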
Received: 21.01.2014
Revised: 19.06.2014
Citation:
V. V. Prelov, “On one extreme value problem for entropy and error probability”, Probl. Peredachi Inf., 50:3 (2014), 3–18; Problems Inform. Transmission, 50:3 (2014), 203–216
Linking options:
https://www.mathnet.ru/eng/ppi2141
https://www.mathnet.ru/eng/ppi/v50/i3/p3