2019-01-15[Stay Sharp]Maximum li

Author: 三千雨点 | Published 2019-01-15 22:31 | Read 2 times

Maximum likelihood estimation

Maximum likelihood estimation (MLE) is a method of finding the parameter values that maximize the likelihood function.

For a likelihood function \mathcal{L}(\theta; x), MLE finds the parameter \theta that maximizes \mathcal{L}(\theta; x):

\hat{\theta} \in \{ \underset{\theta \in \Theta}{\arg\max} \, \mathcal{L}(\theta; x) \}
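The argmax above can be illustrated with a minimal sketch: for hypothetical Bernoulli data (coin flips), a brute-force grid search over the parameter space \Theta = [0, 1] picks the \theta with the largest likelihood. The data and the grid resolution here are illustrative assumptions, not from the original post.

```python
# Observed coin flips (hypothetical data): 7 heads out of 10
data = [1, 1, 1, 0, 1, 0, 1, 1, 0, 1]

def likelihood(theta, xs):
    """Likelihood of a Bernoulli parameter theta given observations xs."""
    l = 1.0
    for x in xs:
        l *= theta if x == 1 else (1.0 - theta)
    return l

# Grid search over the parameter space Theta = [0, 1]
grid = [i / 1000 for i in range(1001)]
theta_hat = max(grid, key=lambda t: likelihood(t, data))
print(theta_hat)  # 0.7, the fraction of heads
```

As expected, the maximizer coincides with the sample proportion of heads, which is the closed-form MLE for a Bernoulli parameter.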
Often we transform the likelihood function with the natural logarithm; since the logarithm is strictly increasing, maximizing the transformed function is equivalent to maximizing the likelihood. The transformed function is called the log-likelihood, \ell(\theta; x) = \ln \mathcal{L}(\theta; x), and its normalized version the average log-likelihood, \hat{\ell}(\theta; x) = \frac{1}{n} \ln \mathcal{L}(\theta; x).

If the data are independent and identically distributed (i.i.d.), the average log-likelihood becomes

\hat{\ell}(\theta; x) = \frac{1}{n} \sum_{i=1}^{n} \ln f(x_i \mid \theta)
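For i.i.d. data, this sum-of-logs form can be evaluated directly. A minimal sketch, assuming a normal model with known \sigma = 1 and a synthetic sample (both assumptions for illustration): maximizing the average log-likelihood over a grid of candidate means recovers the sample mean, the known closed-form MLE for a normal mean.

```python
import math
import random

random.seed(0)
# Hypothetical i.i.d. sample from N(mu=2, sigma=1); sigma is treated as known
xs = [random.gauss(2.0, 1.0) for _ in range(500)]

def avg_log_likelihood(mu, xs, sigma=1.0):
    """(1/n) * sum of ln f(x_i | mu) for a N(mu, sigma^2) density f."""
    n = len(xs)
    return sum(-0.5 * math.log(2 * math.pi * sigma ** 2)
               - (x - mu) ** 2 / (2 * sigma ** 2) for x in xs) / n

# Coarse grid search for mu over [0, 4] in steps of 0.01
grid = [i / 100 for i in range(401)]
mu_hat = max(grid, key=lambda m: avg_log_likelihood(m, xs))

sample_mean = sum(xs) / len(xs)
print(mu_hat, round(sample_mean, 2))  # mu_hat matches the sample mean to grid precision
```

Because the average log-likelihood is quadratic in \mu here, the grid maximizer is simply the grid point nearest the sample mean; in practice one would use a numerical optimizer or the closed form instead of a grid.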

References

https://en.wikipedia.org/wiki/Maximum_likelihood_estimation

https://towardsdatascience.com/probability-concepts-explained-maximum-likelihood-estimation-c7b4342fdbb1


Permalink: https://www.haomeiwen.com/subject/uhiudqtx.html