This article is cited in 37 scientific papers.
METHODOLOGICAL NOTES
Entropy and information of open systems
Yu. L. Klimontovich Lomonosov Moscow State University, Faculty of Physics
Abstract:
Of the two definitions of 'information' given by Shannon and employed in communication theory, one is identical to Boltzmann's entropy and is in fact a measure of statistical uncertainty. The other involves the difference between unconditional and conditional entropies and, if properly specified, allows the introduction of a measure of information for an open system that depends on the values of the system's control parameters. Two classes of systems are identified. For systems of the first class, an equilibrium state is possible and the law of conservation of information and entropy holds. At equilibrium, such systems have zero information and maximum entropy. In self-organization processes, information increases as the system moves away from the equilibrium state. For systems of the second class, an equilibrium state is impossible. For these, the so-called 'chaoticity norm' is introduced, two kinds of self-organization processes are considered, and the concept of information is defined accordingly. The resulting definitions of information are applied to classical and quantum physical systems as well as to medical and biological systems.
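As a minimal illustration (not from the paper itself), Shannon's second definition — information as the difference between an unconditional and a conditional entropy — can be sketched in Python for a hypothetical joint distribution of two binary variables; the distribution values here are arbitrary and chosen only for the example:

```python
from collections import Counter
from math import log2

def entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Hypothetical joint distribution p(x, y) over two binary variables.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginal distributions p(x) and p(y).
px, py = Counter(), Counter()
for (x, y), p in joint.items():
    px[x] += p
    py[y] += p

H_y = entropy(py.values())                 # unconditional entropy H(Y)
H_xy = entropy(joint.values())             # joint entropy H(X, Y)
H_y_given_x = H_xy - entropy(px.values())  # conditional entropy H(Y|X)

# Information in Shannon's second sense: I = H(Y) - H(Y|X).
info = H_y - H_y_given_x
print(H_y, H_y_given_x, info)
```

Here `info` is non-negative and vanishes exactly when the two variables are independent, which parallels the paper's statement that information is zero at equilibrium and grows away from it.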
Received: November 30, 1998
Citation:
Yu. L. Klimontovich, “Entropy and information of open systems”, UFN, 169:4 (1999), 443–452; Phys. Usp., 42:4 (1999), 375–384
Linking options:
https://www.mathnet.ru/eng/ufn1597 https://www.mathnet.ru/eng/ufn/v169/i4/p443