Problemy Peredachi Informatsii

Problemy Peredachi Informatsii, 2020, Volume 56, Issue 2, Pages 3–63
DOI: https://doi.org/10.31857/S0555292320020011
(Mi ppi2315)
 

This article is cited in 13 scientific papers.

Information Theory

Comparison of contraction coefficients for $f$-divergences

A. Makur, L. Zheng

Department of Electrical Engineering and Computer Science, Massachusetts Institute of Technology, Cambridge, MA, USA
Abstract: Contraction coefficients are distribution dependent constants that are used to sharpen standard data processing inequalities for $f$-divergences (or relative $f$-entropies) and produce so-called “strong” data processing inequalities. For any bivariate joint distribution, i.e., any probability vector and stochastic matrix pair, it is known that contraction coefficients for $f$-divergences are upper bounded by unity and lower bounded by the contraction coefficient for $\chi^2$-divergence. In this paper, we elucidate that the upper bound is achieved when the joint distribution is decomposable, and the lower bound can be achieved by driving the input $f$-divergences of the contraction coefficients to zero. Then, we establish a linear upper bound on the contraction coefficients of joint distributions for a certain class of $f$-divergences using the contraction coefficient for $\chi^2$-divergence, and refine this upper bound for the salient special case of Kullback–Leibler (KL) divergence. Furthermore, we present an alternative proof of the fact that the contraction coefficients for KL and $\chi^2$-divergences are equal for bivariate Gaussian distributions (where the former coefficient may impose a bounded second moment constraint). Finally, we generalize the well-known result that contraction coefficients of stochastic matrices (after extremizing over all possible probability vectors) for all nonlinear operator convex $f$-divergences are equal. In particular, we prove that the so-called “less noisy” preorder over stochastic matrices can be equivalently characterized by any nonlinear operator convex $f$-divergence. As an application of this characterization, we also derive a generalization of Samorodnitsky's strong data processing inequality.
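For concreteness, with input distribution $P_X$ and channel (stochastic matrix) $W$, the contraction coefficient for an $f$-divergence referred to in the abstract is $\eta_f(P_X,W)=\sup_{Q_X:\,0<D_f(Q_X\|P_X)<\infty} D_f(Q_XW\|P_XW)/D_f(Q_X\|P_X)$, and the stated bounds read $\eta_{\chi^2}(P_X,W)\le\eta_f(P_X,W)\le 1$. The lower-bound quantity $\eta_{\chi^2}(P_X,W)$ has a well-known spectral characterization: it equals the squared maximal correlation of $(X,Y)$, i.e., the squared second-largest singular value of the matrix with entries $P_{X,Y}(x,y)/\sqrt{P_X(x)P_Y(y)}$. The following NumPy sketch illustrates this standard characterization; it is not code from the paper, it assumes finite alphabets with strictly positive marginals, and the function name is illustrative.

```python
import numpy as np

def chi_squared_contraction(p_x, W):
    """Contraction coefficient for chi^2-divergence of the pair (p_x, W).

    Uses the standard spectral characterization: the squared second-largest
    singular value of the matrix B[x, y] = P_{X,Y}(x, y) / sqrt(P_X(x) P_Y(y)),
    assuming strictly positive marginals.
    """
    p_x = np.asarray(p_x, dtype=float)
    W = np.asarray(W, dtype=float)
    p_y = p_x @ W                            # output marginal P_Y = P_X W
    joint = p_x[:, None] * W                 # joint distribution P_{X,Y}
    B = joint / np.sqrt(np.outer(p_x, p_y))  # "divergence transition matrix"
    s = np.linalg.svd(B, compute_uv=False)   # singular values, descending
    return s[1] ** 2                         # largest is always 1; take the next

# Example: binary symmetric channel with crossover 0.1 and uniform input.
p_x = [0.5, 0.5]
W = [[0.9, 0.1], [0.1, 0.9]]
print(chi_squared_contraction(p_x, W))       # (1 - 2*0.1)^2 = 0.64
```

For a binary symmetric channel with crossover probability $\delta$ and uniform input, this returns $(1-2\delta)^2$, the squared maximal correlation of the corresponding doubly symmetric binary source.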
Keywords: contraction coefficient, $f$-divergence/relative $f$-entropy, strong data processing inequality, less noisy preorder, maximal correlation.
Funding agency: National Science Foundation; grant number: 1216476.
This work was funded by the National Science Foundation (NSF) under Award 1216476 and the Hewlett-Packard Fellowship.
Received: 17.10.2019
Revised: 17.10.2019
Accepted: 09.03.2020
English version:
Problems of Information Transmission, 2020, Volume 56, Issue 2, Pages 103–156
DOI: https://doi.org/10.1134/S0032946020020015
Document Type: Article
UDC: 621.391.1 : 519.72
Language: Russian
Citation: A. Makur, L. Zheng, “Comparison of contraction coefficients for $f$-divergences”, Probl. Peredachi Inf., 56:2 (2020), 3–63; Problems Inform. Transmission, 56:2 (2020), 103–156
Citation in format AMSBIB
\Bibitem{MakZhe20}
\by A.~Makur, L.~Zheng
\paper Comparison of contraction coefficients for $f$-divergences
\jour Probl. Peredachi Inf.
\yr 2020
\vol 56
\issue 2
\pages 3--63
\mathnet{http://mi.mathnet.ru/ppi2315}
\crossref{https://doi.org/10.31857/S0555292320020011}
\elib{https://elibrary.ru/item.asp?id=45448027}
\transl
\jour Problems Inform. Transmission
\yr 2020
\vol 56
\issue 2
\pages 103--156
\crossref{https://doi.org/10.1134/S0032946020020015}
\isi{https://gateway.webofknowledge.com/gateway/Gateway.cgi?GWVersion=2&SrcApp=Publons&SrcAuth=Publons_CEL&DestLinkType=FullRecord&DestApp=WOS_CPL&KeyUT=000548217800001}
\scopus{https://www.scopus.com/record/display.url?origin=inward&eid=2-s2.0-85087995755}
Linking options:
  • https://www.mathnet.ru/eng/ppi2315
  • https://www.mathnet.ru/eng/ppi/v56/i2/p3