Information Theory

Information theory is a branch of the mathematical theory of Probability and mathematical Statistics that deals with the concepts of information and information Entropy, communication systems, data transmission and rate distortion theory, cryptography, signal-to-noise ratios, data compression, and related topics. http://www.wikipedia.org/wiki/Information_theory

Claude Shannon

  • the information content, self-information, surprisal, or Shannon information is a basic quantity derived from the probability of a particular event occurring from a random variable. It can be thought of as an alternative way of expressing probability, much like odds or log-odds, but which has particular mathematical advantages in the setting of information theory. The Shannon information can be interpreted as quantifying the level of "surprise" of a particular outcome. https://en.wikipedia.org/wiki/Information_content
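The self-information described above has a simple formula: for an outcome with probability p, the surprisal is -log2(p) bits, so rarer events carry more information. A minimal sketch (the function name is my own, not from the source):

```python
import math

def self_information(p, base=2):
    """Shannon information content (surprisal) of an outcome with
    probability p, in bits when base=2: I(p) = -log_base(p)."""
    if not 0 < p <= 1:
        raise ValueError("p must be in (0, 1]")
    return -math.log(p, base)

# A fair coin flip: each outcome carries 1 bit of surprisal.
print(self_information(0.5))    # 1.0
# A rarer event is more "surprising": p = 1/8 gives 3 bits.
print(self_information(0.125))  # 3.0
```

This makes the "alternative way of expressing probability" concrete: certain events (p = 1) carry 0 bits, and surprisal grows without bound as p approaches 0.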

Gregory Bateson wrote: "In fact, what we mean by information - the elementary unit of information - is a Difference Which Makes A Difference."
