Information theory studies the quantification, storage, and communication of information.
In information theory, a system is modeled by a transmitter, a channel, and a receiver. The transmitter produces messages that are sent through the channel; the channel modifies the message in some way, for example by adding noise; and the receiver attempts to infer which message was sent. In this context, entropy (more specifically, Shannon entropy) is the expected value (average) of the information contained in each message. 'Messages' can be modeled by any flow of information.

Norbert Wiener introduced the concepts of amount of information, entropy, feedback, and background noise as essential characteristics of how the human brain functions. As he observed, the notion of the amount of information attaches itself very naturally to a classical notion in statistical mechanics: that of entropy. Just as the amount of information in a system is a measure of its degree of organisation, so the entropy of a system is a measure of its degree of disorganisation.
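The definition of entropy above can be made concrete with a short computation. The following is a minimal sketch in Python, estimating H = -Σ p(x) log2 p(x) from the empirical symbol frequencies of a message string; the function name shannon_entropy and the frequency-based estimate are illustrative assumptions, not part of the original text.

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Estimate Shannon entropy in bits per symbol from empirical
    symbol frequencies (an illustrative sketch; assumes a non-empty message)."""
    counts = Counter(message)
    total = len(message)
    # H = sum over symbols x of -p(x) * log2(p(x)),
    # where p(x) is the relative frequency of symbol x in the message.
    return sum(-(c / total) * math.log2(c / total) for c in counts.values())

print(shannon_entropy("01010101"))  # two equally likely symbols -> 1.0 bit
print(shannon_entropy("aaaaaaaa"))  # a constant message -> 0.0 bits
```

A uniform source maximises the entropy, while a perfectly predictable source carries no information, consistent with Wiener's reading of entropy as a measure of disorganisation.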