Objective functions (loss functions) in Keras

Author: dopami | Published 2018-11-24 09:58

    https://blog.csdn.net/meanme/article/details/50813719
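
    All the snippets below are written against the Keras backend API and mirror Keras's built-in objectives of that era. To run them standalone you need roughly the following imports (an assumption on my part; the original post does not show them):

        import numpy as np
        from keras import backend as K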

    (1) mean-squared-error

        def mean_squared_error(y_true, y_pred):

            return K.mean(K.square(y_pred - y_true), axis=-1)

    (2) root-mean-squared-error

        def root_mean_squared_error(y_true, y_pred):

            return K.sqrt(K.mean(K.square(y_pred - y_true), axis=-1))

    (3) mean-absolute-error

        def mean_absolute_error(y_true, y_pred):

            return K.mean(K.abs(y_pred - y_true), axis=-1)

    (4) mean-absolute-percentage-error

        def mean_absolute_percentage_error(y_true, y_pred):

            # clip keeps |y_true| at least K.epsilon() to avoid division by zero
            diff = K.abs((y_true - y_pred) / K.clip(K.abs(y_true), K.epsilon(), np.inf))

            return 100. * K.mean(diff, axis=-1)

    (5) mean-squared-logarithmic-error

        def mean_squared_logarithmic_error(y_true, y_pred):

            # clip keeps both arguments positive so the logs are well defined
            first_log = K.log(K.clip(y_pred, K.epsilon(), np.inf) + 1.)

            second_log = K.log(K.clip(y_true, K.epsilon(), np.inf) + 1.)

            return K.mean(K.square(first_log - second_log), axis=-1)

    (6) squared-hinge

        def squared_hinge(y_true, y_pred):

            # assumes labels y_true are encoded as -1 or +1
            return K.mean(K.square(K.maximum(1. - y_true * y_pred, 0.)), axis=-1)

    (7) hinge (max-margin loss)

        def hinge(y_true, y_pred):

            # assumes labels y_true are encoded as -1 or +1
            return K.mean(K.maximum(1. - y_true * y_pred, 0.), axis=-1)

    (8) categorical-crossentropy

        def categorical_crossentropy(y_true, y_pred):

            '''Expects a binary class matrix instead of a vector of scalar classes.

            '''

            return K.mean(K.categorical_crossentropy(y_pred, y_true), axis=-1)

    This is the most commonly used objective for single-label, multi-class classification problems.
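
    As a usage sketch (the layer sizes and random data here are illustrative assumptions, not from the original post), a softmax output layer is paired with this loss and one-hot labels:

        import numpy as np
        from keras.models import Sequential
        from keras.layers import Dense

        # hypothetical 10-class classifier on 100-dimensional inputs
        model = Sequential()
        model.add(Dense(64, activation='relu', input_dim=100))
        model.add(Dense(10, activation='softmax'))  # softmax output, one class per sample
        model.compile(loss='categorical_crossentropy', optimizer='sgd')

        x = np.random.random((32, 100))
        y = np.eye(10)[np.random.randint(10, size=32)]  # one-hot "binary class matrix"
        model.fit(x, y)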

    (9) binary-crossentropy

        def binary_crossentropy(y_true, y_pred):

            return K.mean(K.binary_crossentropy(y_pred, y_true), axis=-1)

    If you set the activation of the network's last layer to sigmoid (or tanh) and use this objective as the loss, you can train on multi-label datasets, as in the sketch below.
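
    A minimal multi-label sketch along those lines (sizes and random data are again illustrative assumptions): each output unit is an independent sigmoid, and a target row may contain several 1s:

        import numpy as np
        from keras.models import Sequential
        from keras.layers import Dense

        # hypothetical: 5 independent binary labels per sample
        model = Sequential()
        model.add(Dense(32, activation='relu', input_dim=20))
        model.add(Dense(5, activation='sigmoid'))  # one sigmoid per label
        model.compile(loss='binary_crossentropy', optimizer='rmsprop')

        x = np.random.random((16, 20))
        y = np.random.randint(2, size=(16, 5)).astype('float32')  # several labels may be 1 at once
        model.fit(x, y)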

    (10) poisson

        def poisson(y_true, y_pred):

            # adding K.epsilon() avoids taking log(0)
            return K.mean(y_pred - y_true * K.log(y_pred + K.epsilon()), axis=-1)

    (11) cosine-proximity

        def cosine_proximity(y_true, y_pred):

            assert K.ndim(y_true) == 2

            assert K.ndim(y_pred) == 2

            y_true = K.l2_normalize(y_true, axis=1)

            y_pred = K.l2_normalize(y_pred, axis=1)

            # negative (scaled) cosine similarity: minimizing it maximizes alignment
            return -K.mean(y_true * y_pred, axis=1)

    Summary

    The parameter y_true is the given label and y_pred is the network's output.
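
    To make that concrete, here is a quick check (assuming the imports above and the mean_squared_error definition from (1); K.eval pulls the symbolic result back as a numpy array):

        y_true = K.variable(np.array([[1., 2., 3.]]))
        y_pred = K.variable(np.array([[1.5, 2.5, 2.0]]))

        # axis=-1 averages over the feature axis, leaving one loss value per sample
        print(K.eval(mean_squared_error(y_true, y_pred)))  # -> [0.5]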

    The clip function takes an input x (of any rank), sets every element smaller than min to min, and every element larger than max to max. You can experiment with the code below (the numpy and theano interfaces are nearly identical):

    x = np.random.uniform(0., 6., size=(2, 3, 4, 5))  # random values in [0, 6)

    print(x)

    print(np.clip(a=x, a_min=2, a_max=4))  # every value is clamped into [2, 4]

    To define a custom objective function, here is an example that returns the mean absolute difference between y_pred and y_true. The function must take (y_true, y_pred) in that order and use backend (K) operations rather than numpy, so that the loss stays symbolic and differentiable:

    def custom_loss(y_true, y_pred):

        return K.mean(K.abs(y_pred - y_true), axis=-1)

    When compiling, pass the function itself rather than its return value:

    model.compile(loss=custom_loss, optimizer=optimizer)


    ---------------------

    Author: Xiaomin-Wu

    Source: CSDN

    Original: https://blog.csdn.net/meanme/article/details/50813719
