How is Bayes' Theorem derived from conditional probability?


Author: 宣雄民 | Published 2021-09-02 11:25

    Bayes' theorem

    Bayes' theorem describes the probability of an event based on conditions that may be related to it; it is a statement about conditional probability. It is arguably the most important rule in data science: the mathematical rule that describes how to update a belief given some evidence. In other words, it describes the act of learning.


    Conditional probability

    "Probability of event A and event B equals
    the probability of event A times the probability of event B given event A"

    Suppose we want to compute the probability of the intersection of events A and B, that is, P(A \cap B).

    \because P(A \cap B) = P(A) * P(B|A)
    Since P(B|A) is the probability of event B given that event A has occurred, P(A) * P(B|A) yields the overlap region (the intersection A \cap B in a Venn diagram).

    On the other hand, the formula can also be interpreted the other way around:
    P(A \cap B) = P(B) * P(A|B), which results in the same overlap region A \cap B.

    \therefore P(A \cap B) = P(B|A) * P(A) = P(A|B) * P(B)
    P(B|A) = \frac{P(A \cap B)}{P(A)}
    P(A|B) = \frac{P(A \cap B)}{P(B)}
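    The identity P(A ∩ B) = P(A) * P(B|A) can be checked numerically. The following sketch is not from the original article; the dice events are a made-up illustration. It simulates rolls of a fair die and verifies that the estimated intersection probability matches the product of the estimated prior and conditional:

```python
import random

random.seed(0)
N = 100_000

# Event A: a fair die shows an even number; event B: it shows a number > 3.
rolls = [random.randint(1, 6) for _ in range(N)]
A = [r % 2 == 0 for r in rolls]
B = [r > 3 for r in rolls]

p_a = sum(A) / N
p_b_given_a = sum(a and b for a, b in zip(A, B)) / sum(A)
p_a_and_b = sum(a and b for a, b in zip(A, B)) / N

# P(A ∩ B) should equal P(A) * P(B|A); here P(A ∩ B) = P({4, 6}) = 1/3.
print(p_a_and_b, p_a * p_b_given_a)
```

    Note that the identity holds exactly for the empirical frequencies, by construction: count(A and B)/N = (count(A)/N) * (count(A and B)/count(A)).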


    Derivation

    \because P(A \cap B) = P(B|A) * P(A) = P(A|B) * P(B)
    P(B|A) = \frac{P(A \cap B)}{P(A)}
    P(A|B) = \frac{P(A \cap B)}{P(B)}

    \therefore Bayes' theorem:
    P(A|B) = \frac{P(B|A) * P(A)}{P(B)}
    where P(B) is also called the "total probability" of B.

    Total Probability

    P(B) = P(A \cap B) + P(A^c \cap B) = P(B|A) * P(A) + P(B|A^c) * P(A^c)
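    The derivation and the total-probability expansion can be combined into a small numerical example. The numbers below are hypothetical (not from the article): a diagnostic-test scenario where A is "has the condition" and B is "test is positive":

```python
# Hypothetical inputs for a diagnostic-test example.
p_a = 0.01              # P(A): prior probability of the condition
p_b_given_a = 0.95      # P(B|A): test positive given the condition
p_b_given_not_a = 0.05  # P(B|A^c): false-positive rate

# Total probability: P(B) = P(B|A)P(A) + P(B|A^c)P(A^c)
p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)

# Bayes' theorem: P(A|B) = P(B|A)P(A) / P(B)
p_a_given_b = p_b_given_a * p_a / p_b
print(f"P(B) = {p_b:.4f}, P(A|B) = {p_a_given_b:.4f}")
```

    Even with a fairly accurate test, the posterior stays modest because the prior P(A) is small; this is why the denominator P(B) matters.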


    Interpretation

    The formula can be interpreted as follows

    For a partition of hypotheses E_1, \dots, E_n and evidence A:
    P(E_i|A) = \frac{P(A|E_i)P(E_i)}{\sum_{k=1}^n{P(A|E_k)P(E_k)}}

    P(Hypothesis|Evidence) = P(Hypothesis) * \frac{P(Evidence|Hypothesis)}{P(Evidence)}
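    The partition form can be sketched numerically. The scenario below is a made-up illustration (not from the article): three equally likely coins with different heads probabilities, where the evidence A is "the drawn coin landed heads":

```python
# Posterior over a partition of hypotheses E_1..E_n given evidence A.
priors = [1/3, 1/3, 1/3]       # P(E_k): each coin equally likely a priori
likelihoods = [0.2, 0.5, 0.9]  # P(A|E_k): heads probability of each coin

# Marginal P(A) is the sum over the partition (law of total probability).
marginal = sum(l * p for l, p in zip(likelihoods, priors))
posteriors = [l * p / marginal for l, p in zip(likelihoods, priors)]
print([round(p, 4) for p in posteriors])
```

    The posteriors sum to 1 by construction, since the denominator is exactly the sum of the numerators over the partition.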

    • Posterior
      The updated probability after the evidence is considered.
    • Prior
      The probability before the evidence is even considered.
    • Likelihood
      The probability of the evidence, given the belief is true.
    • Marginal
      The probability of the evidence, whether or not the hypothesis is true.

    Bayes' Rule lets you calculate the posterior (or "updated") probability. This is a conditional probability. It is the probability of the hypothesis being true if the evidence is present.

    Think of the prior (or "previous") probability as your belief in the hypothesis before seeing the new evidence. If you had a strong belief in the hypothesis already, the prior probability will be large.

    The prior is multiplied by a fraction. Think of this as the "strength or weight" of the evidence. The posterior probability is greater when the top part (numerator) is big, and the bottom part (denominator) is small.

    The numerator is the likelihood. This is another conditional probability. It is the probability of the evidence being present, given the hypothesis is true.

    The denominator is the marginal probability of the evidence. That is, it is the probability of the evidence being present, whether the hypothesis is true or false. The smaller the denominator, the more "convincing" the evidence.
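    The "act of learning" described above can be shown as sequential updating, where yesterday's posterior becomes today's prior. The function and numbers below are hypothetical, added here only to illustrate the idea:

```python
# Sequential Bayesian updating: the posterior after one piece of evidence
# becomes the prior for the next.
def update(prior, p_e_given_h, p_e_given_not_h):
    """Return P(H|E) from P(H), P(E|H), and P(E|H^c)."""
    marginal = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / marginal

belief = 0.01  # weak initial belief in the hypothesis
for _ in range(3):  # three independent pieces of supporting evidence
    belief = update(belief, p_e_given_h=0.9, p_e_given_not_h=0.1)
print(round(belief, 4))
```

    Each round of consistent evidence strengthens the belief: here it climbs from 0.01 to about 0.08, then 0.45, then 0.88.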
