Cross entropy measures the difference between two probability distributions. Since classification models in deep learning typically output a probability for each class, we can compare the true class distribution p (usually a one-hot vector) with the model's predicted distribution q. The closer q is to p, the smaller the cross entropy.
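To make this concrete, the cross entropy is H(p, q) = -Σᵢ p(i) · log q(i). Here is a minimal NumPy sketch of that formula; the function name and the example probability vectors are illustrative, not from the original:

```python
import numpy as np

def cross_entropy(p, q, eps=1e-12):
    """H(p, q) = -sum_i p_i * log(q_i).

    p: true distribution (e.g. a one-hot label vector)
    q: predicted distribution (model output probabilities)
    eps guards against log(0) when the model assigns a hard zero.
    """
    q = np.clip(q, eps, 1.0)
    return -np.sum(p * np.log(q))

# True class is class 2 (one-hot encoded over 3 classes).
p = np.array([0.0, 0.0, 1.0])

# A confident, correct prediction vs. a poor one.
q_good = np.array([0.1, 0.2, 0.7])
q_bad = np.array([0.5, 0.4, 0.1])

print(cross_entropy(p, q_good))  # ~0.357: q close to p, small loss
print(cross_entropy(p, q_bad))   # ~2.303: q far from p, large loss
```

With a one-hot p, the sum collapses to -log q(true class), which is why this loss is also called negative log-likelihood in the classification setting.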