The Meaning of Cross Entropy


Author: 美环花子若野 | Published 2018-06-08 10:44 · 14 reads

    Cross entropy measures how different two probability distributions are. Since classification models in deep learning typically output a probability for each class, we can compare the true label (distribution p, usually a one-hot vector) with the class probabilities predicted by the model (distribution q). The more similar the two distributions, the smaller the cross entropy.
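    As a minimal sketch of the idea above, the function below computes H(p, q) = -Σᵢ pᵢ·log(qᵢ) directly; the function name and the example distributions are illustrative, not from the original article.

    ```python
    import math

    def cross_entropy(p, q):
        """Cross entropy H(p, q) = -sum_i p_i * log(q_i).

        Terms with p_i == 0 contribute nothing and are skipped,
        which also avoids log(0) for classes the model rules out.
        """
        return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

    # True class as a one-hot distribution p; q is the model's output.
    p = [0.0, 1.0, 0.0]
    q_good = [0.1, 0.8, 0.1]  # confident in the true class
    q_bad = [0.6, 0.2, 0.2]   # favors a wrong class

    print(cross_entropy(p, q_good))  # smaller loss
    print(cross_entropy(p, q_bad))   # larger loss
    ```

    With a one-hot p, the sum collapses to -log(q) at the true class, which is why this loss is also called negative log-likelihood in the classification setting.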




          Original link: https://www.haomeiwen.com/subject/fpbosftx.html