After Shannon's paper, it became apparent that information is a well-defined and measurable quantity.
For any communication channel:
- There is a definite upper limit, the channel capacity, to the amount of information that can be communicated through that channel.
- This limit shrinks as the amount of noise in the channel increases.
- This limit can very nearly be reached by judicious packaging, or encoding, of data.
One bit of information allows you to choose between two equally probable, or equiprobable, alternatives. For example, for a traveller who does not know the way, each fork in the road requires one bit of information to make a correct decision.
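As a small numeric illustration (not from the source), the number of bits needed to select one of N equiprobable alternatives is log2(N), so each two-way fork costs one bit; the values below are invented for the example.

```python
import math

# Bits needed to pick one of n equiprobable alternatives: log2(n).
for n in [2, 4, 8]:
    print(f"{n} equiprobable alternatives -> {math.log2(n):.0f} bit(s)")

# A route with three independent forks (two equiprobable branches each)
# therefore needs 3 * 1 = 3 bits of directions in total.
forks = 3
print(f"{forks} forks -> {forks * math.log2(2):.0f} bits")
```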
A binary digit is the value of a binary variable, whereas a bit is an amount of information.
The Shannon information, or surprisal, of each outcome is measured in bits: more information is associated with more surprising (less probable) outcomes. We are also interested in how much surprise is associated with the entire set of possible values.
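A minimal sketch of surprisal, assuming the standard definition h(p) = -log2(p) bits; the probabilities below are made up for illustration.

```python
import math

def surprisal(p: float) -> float:
    """Shannon information (surprisal) of an outcome with probability p, in bits."""
    return -math.log2(p)

# Less probable outcomes are more surprising and so carry more information.
for p in [0.5, 0.25, 0.01]:
    print(f"p = {p:<5} -> surprisal = {surprisal(p):.2f} bits")
```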
Entropy is a measure of uncertainty: the surprisal averaged over all possible outcomes. Receiving an amount of information is equivalent to having exactly the same amount of entropy (uncertainty) taken away.
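A short sketch of entropy as average surprisal, H = -Σ p_i log2(p_i) bits; the two example distributions are assumptions chosen for illustration.

```python
import math

def entropy(probs) -> float:
    """Average surprisal (entropy) of a distribution, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

fair_coin = [0.5, 0.5]     # maximum uncertainty for two outcomes: 1 bit
biased_coin = [0.9, 0.1]   # less uncertain, so less entropy
print(f"fair coin:   H = {entropy(fair_coin):.3f} bits")
print(f"biased coin: H = {entropy(biased_coin):.3f} bits")

# Learning the outcome of the fair coin toss provides 1 bit of information,
# i.e. exactly the amount of entropy that has been removed.
```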
The channel capacity C is the maximum amount of information that a channel can provide at its output about the input. It is determined by three quantities:
- the entropy H(x) of the input
- the entropy H(y) of the output
- the entropy H(η) of the noise in the channel
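As a hedged sketch of how these entropies combine, consider a binary symmetric channel whose noise flips each input bit with probability p (an assumed channel model, not stated above). The information the output carries about the input is then H(y) - H(η), and the capacity is its maximum over input distributions, reached for equiprobable inputs: C = 1 - H(p).

```python
import math

def h2(p: float) -> float:
    """Binary entropy H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(flip_prob: float) -> float:
    """Capacity of a binary symmetric channel: C = H(y) - H(noise) = 1 - H(p)."""
    return 1.0 - h2(flip_prob)

# Capacity shrinks as the amount of noise in the channel increases.
for p in [0.0, 0.05, 0.11, 0.5]:
    print(f"flip probability {p:<4} -> C = {bsc_capacity(p):.3f} bits per use")
```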