Information Theory
Entropy and compression: a simple proof of an inequality of Khinchin–Ornstein–Shields
R. Aragona (a), F. Marzi (a), F. Mignosi (a,b), M. Spezialetti (c)
(a) DISIM, University of L'Aquila, Coppito, L'Aquila, Italy
(b) ICAR-CNR, Palermo, Italy
(c) University of Naples “Federico II,” Napoli, Italy
Abstract:
This paper concerns the folklore statement that “entropy is a lower bound for compression.” More precisely, we derive from the entropy theorem a simple proof of a pointwise inequality first stated by Ornstein and Shields, which is the almost-sure version of an average inequality first stated by Khinchin in 1953. We further give an elementary proof of the original Khinchin inequality, suitable as an exercise for information theory students, and we conclude with historical and technical notes on this inequality.
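As a toy illustration of the folklore statement (not drawn from the paper itself), the sketch below compares the order-0 empirical entropy of a byte string, as a bound in bytes, with the length achieved by an off-the-shelf compressor. Note that the theorem discussed in the paper concerns the entropy rate of a stationary ergodic source; the order-0 empirical entropy used here only bounds memoryless (symbol-by-symbol) coding, so a compressor exploiting higher-order structure may beat it, as this example shows.

```python
import math
import zlib
from collections import Counter

def empirical_entropy(data: bytes) -> float:
    """Order-0 empirical entropy of the byte sequence, in bits per symbol."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A highly repetitive message: two equally frequent symbols,
# so the order-0 entropy is exactly 1 bit per byte.
msg = b"abab" * 1000
h = empirical_entropy(msg)            # bits per byte
order0_bound = h * len(msg) / 8       # bound in bytes for memoryless coding

compressed = len(zlib.compress(msg, 9))

print(f"order-0 entropy bound: {order0_bound:.0f} bytes")
print(f"zlib compressed size:  {compressed} bytes")
```

Here zlib compresses far below the order-0 bound because it models the strong dependence between consecutive symbols; the almost-sure lower bound of Ornstein–Shields is phrased in terms of the entropy rate, which for this periodic-like source is much smaller than the order-0 entropy.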
Keywords:
ergodic sources, entropy, lossless data compression, one-to-one code sequence, Shannon–McMillan theorem.
Received: 12.12.2019 Revised: 08.01.2020 Accepted: 15.01.2020
Citation:
R. Aragona, F. Marzi, F. Mignosi, M. Spezialetti, “Entropy and compression: a simple proof of an inequality of Khinchin–Ornstein–Shields”, Probl. Peredachi Inf., 56:1 (2020), 15–25; Problems Inform. Transmission, 56:1 (2020), 13–22
Linking options:
https://www.mathnet.ru/eng/ppi2308
https://www.mathnet.ru/eng/ppi/v56/i1/p15