Problemy Peredachi Informatsii, 1994, Volume 30, Issue 4, Pages 3–11
(Mi ppi249)
This article is cited in 4 scientific papers.
Information Theory
Information Rates in Stationary Gaussian Channels in Weak Signal Transmission
M. S. Pinsker, V. V. Prelov
Abstract:
Let $N=\{N_i\}$ and $Z=\{Z_i\}$ be arbitrary independent discrete-time stationary processes. If $N$ is a regular Gaussian process and $Z$ is a process with completely positive entropy, we prove that the information rate satisfies $\bar{I}(Z;N+\theta Z)=\bar{I}(\bar{Z};N+\theta\bar{Z})+o(\theta^2)$ as $\theta\to 0$, where $\bar{Z}=\{\bar{Z}_i\}$ is a stationary Gaussian process with the same autocorrelation function as $Z$. As a corollary, we obtain generalizations of the results of [1, 2] on the sensitivity of the channel capacity and of the $\varepsilon$-entropy, which allow the regularity assumption on $Z$ to be dropped (in the case of the $\varepsilon$-entropy, the regularity assumption on $N$ and all previous conditions on the spectral densities of $N$ and $Z$ can also be dropped).
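As a simple illustration (this scalar special case and the notation $\sigma_N^2$, $\sigma_Z^2$ are introduced here and are not part of the original abstract): if $N$ and $Z$ are memoryless, with $N_i$ Gaussian of variance $\sigma_N^2$ and $Z_i$ of finite variance $\sigma_Z^2$, then the Gaussian surrogate gives $\bar{I}(\bar{Z};N+\theta\bar{Z})=\frac{1}{2}\log\bigl(1+\theta^2\sigma_Z^2/\sigma_N^2\bigr)=\frac{\theta^2\sigma_Z^2}{2\sigma_N^2}+o(\theta^2)$, and the theorem then asserts that $\bar{I}(Z;N+\theta Z)=\frac{\theta^2\sigma_Z^2}{2\sigma_N^2}+o(\theta^2)$ as well: to second order in $\theta$, the information rate depends on the input only through its autocorrelation function.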
Received: 19.04.1994
Citation:
M. S. Pinsker, V. V. Prelov, “Information Rates in Stationary Gaussian Channels in Weak Signal Transmission”, Probl. Peredachi Inf., 30:4 (1994), 3–11; Problems Inform. Transmission, 30:4 (1994), 291–298
Linking options:
https://www.mathnet.ru/eng/ppi249
https://www.mathnet.ru/eng/ppi/v30/i4/p3
Statistics & downloads: Abstract page: 318, Full-text PDF: 103