Reading "James V Stone - Information Th

Author: JerodYan | Published 2021-06-27 09:58

    After Shannon's paper, it became apparent that information is a well-defined and measurable quantity.

    For any communication channel:

    1. There is a definite upper limit, the channel capacity, to the amount of information that can be communicated through that channel.
    2. This limit shrinks as the amount of noise in the channel increases.
    3. This limit can very nearly be reached by judicious packaging, or encoding, of data.
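The second point can be made concrete with the binary symmetric channel, a standard textbook example (an assumption here, not a channel these notes name): its capacity is 1 − H2(p) bits per use, which shrinks toward zero as the bit-flip probability p approaches 1/2.

```python
import math

def h2(p):
    """Binary entropy H2(p) in bits, with H2(0) = H2(1) = 0 by convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with bit-flip probability p."""
    return 1.0 - h2(p)

# Capacity falls as the channel gets noisier.
for p in (0.0, 0.05, 0.11, 0.5):
    print(f"flip prob {p:.2f} -> capacity {bsc_capacity(p):.3f} bits per use")
```

At p = 0.5 the output is independent of the input and the capacity is zero: no encoding, however judicious, can communicate anything.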

    One bit of information allows you to choose between two equally probable, or equiprobable, alternatives. For example, for a traveller who does not know the way, each fork in the road requires one bit of information to make a correct decision.
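The fork-in-the-road picture generalises: choosing one of n equiprobable alternatives requires log2(n) bits, so k successive binary forks cost k bits. A minimal sketch:

```python
import math

def bits_to_choose(n):
    """Bits of information needed to pick one of n equiprobable alternatives."""
    return math.log2(n)

print(bits_to_choose(2))  # one fork in the road: 1.0 bit
print(bits_to_choose(8))  # three successive forks (2**3 destinations): 3.0 bits
```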

    A binary digit is the value of a binary variable, whereas a bit is an amount of information.

    The Shannon information or surprisal of each outcome is measured in bits. We are interested in how much surprise is associated with the entire set of possible values. More information is associated with the more surprising outcome.
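The surprisal of an outcome with probability p is −log2(p), so rarer outcomes carry more information. A quick check of this:

```python
import math

def surprisal(p):
    """Shannon information (surprisal) of an outcome with probability p, in bits."""
    return -math.log2(p)

# A rare outcome is more surprising, hence more informative, than a common one.
print(surprisal(1 / 2))  # 1.0 bit
print(surprisal(1 / 8))  # 3.0 bits
```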

    Entropy is a measure of uncertainty. Receiving an amount of information is equivalent to having exactly the same amount of entropy (uncertainty) taken away.
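Entropy is the surprisal averaged over the whole set of possible values, which makes the "information received = uncertainty removed" equivalence easy to verify. A sketch, using a fair eight-sided die as an assumed example:

```python
import math

def entropy(probs):
    """Entropy in bits: the average surprisal over all possible outcomes."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# Before the roll, a fair 8-sided die carries 3 bits of uncertainty;
# learning the outcome removes exactly those 3 bits.
print(entropy([1 / 8] * 8))  # 3.0 bits of uncertainty before
print(entropy([1.0]))        # 0.0 bits after the outcome is known
```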

    The channel capacity C is the maximum amount of information that a channel can provide at its output about the input.

    Computing C involves three entropies:

    1. the entropy H(x) of the input
    2. the entropy H(y) of the output
    3. the entropy H(η) of the noise in the channel
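For the binary symmetric channel (again an assumed example), these three entropies relate as I(x;y) = H(y) − H(η), and the capacity is this mutual information at the maximizing input distribution, which is the uniform one. A sketch:

```python
import math

def h2(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

p_flip = 0.1  # assumed noise level (bit-flip probability)
p_x = 0.5     # uniform input maximizes mutual information for this channel

h_x = h2(p_x)                                   # entropy H(x) of the input
p_y = p_x * (1 - p_flip) + (1 - p_x) * p_flip   # P(y = 1) at the output
h_y = h2(p_y)                                   # entropy H(y) of the output
h_noise = h2(p_flip)                            # entropy H(η) of the noise

capacity = h_y - h_noise  # I(x;y) at the maximizing input = channel capacity
print(f"H(x)={h_x:.3f}  H(y)={h_y:.3f}  H(η)={h_noise:.3f}  C={capacity:.3f}")
```

With a uniform input the output is also uniform (H(y) = 1 bit), so the noise entropy is exactly what the channel subtracts from the capacity.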


Permalink: https://www.haomeiwen.com/subject/yyatultx.html