Problemy Peredachi Informatsii, 2017, Volume 53, Issue 3, Pages 16–22
(Mi ppi2239)
Information Theory
On coupling of probability distributions and estimating the divergence through variation
V. V. Prelov
Kharkevich Institute for Information Transmission Problems, Russian Academy of Sciences, Moscow, Russia
Abstract:
Let $X$ be a discrete random variable with a given probability distribution. For any $\alpha$, $0\le\alpha\le1$, we obtain precise values for both the maximum and minimum variational distance between $X$ and another random variable $Y$ under which an $\alpha$-coupling of these random variables is possible. We also give the maximum and minimum values of the coupling of $X$ and $Y$ provided that the variational distance between these random variables is fixed. As a consequence, we obtain a new lower bound on the divergence in terms of the variational distance.
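The following worked illustration is not taken from the paper; it is a minimal sketch assuming the standard definitions, namely that an $\alpha$-coupling of $X$ and $Y$ is a joint distribution $P_{XY}$ with the prescribed marginals $P_X$, $P_Y$ and $\Pr\{X=Y\}\ge\alpha$, and that the variational distance is $V(P_X,P_Y)=\sum_x|P_X(x)-P_Y(x)|$. Under these conventions the classical maximal-coupling identity reads
$$
\max_{P_{XY}}\Pr\{X=Y\}=1-\tfrac{1}{2}\,V(P_X,P_Y),
$$
so an $\alpha$-coupling exists if and only if $V(P_X,P_Y)\le 2(1-\alpha)$. For example, if $P_X=(1/2,1/2)$ and $P_Y=(3/4,1/4)$, then $V(P_X,P_Y)=1/2$, the best coupling achieves $\Pr\{X=Y\}=3/4$, and an $\alpha$-coupling exists precisely for $\alpha\le 3/4$.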
Received: 22.11.2016; Revised: 10.02.2017
Citation:
V. V. Prelov, “On coupling of probability distributions and estimating the divergence through variation”, Probl. Peredachi Inf., 53:3 (2017), 16–22; Problems Inform. Transmission, 53:3 (2017), 215–221
Linking options:
https://www.mathnet.ru/eng/ppi2239
https://www.mathnet.ru/eng/ppi/v53/i3/p16