Handling Data Imbalance - Focal Loss

Author: dreampai | Published 2019-10-09 17:03

Setting gamma > 0 shrinks the loss contribution of easily classified samples, so training focuses on the hard, misclassified ones.
A balancing factor alpha is added to handle the imbalance between positive and negative samples. The paper uses alpha = 0.25, i.e. positives get a smaller weight than negatives; this works because the abundant negatives are mostly easy and are already down-weighted by the gamma term.
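To see how the modulating factor (1 - p_t)^gamma behaves, here is a minimal pure-Python sketch (the probabilities are illustrative numbers, not from the paper):

```python
def modulating_factor(p_t, gamma=2.0):
    """Focal loss modulating factor; p_t is the predicted probability of the true class."""
    return (1.0 - p_t) ** gamma

easy = modulating_factor(0.9)   # well-classified sample: (0.1)**2 = 0.01
hard = modulating_factor(0.1)   # misclassified sample:   (0.9)**2 = 0.81
# The easy sample's loss is scaled down ~81x relative to the hard one.
print(easy, hard)
```

With gamma = 0 the factor is always 1 and the loss reduces to ordinary cross-entropy; larger gamma pushes the focus further toward hard examples.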

# focal loss for multi-class, one-hot labels (Keras / TensorFlow 1.x style)
import tensorflow as tf
from tensorflow.python.ops import array_ops
from keras import backend as K

def focal_loss(classes_num, gamma=2., alpha=.25, e=0.1):
    # classes_num contains the sample count of each class;
    # note the scalar alpha argument is superseded by the per-class weights computed below
    def focal_loss_fixed(target_tensor, prediction_tensor):
        '''
        prediction_tensor is the output tensor with shape [None, num_classes]
        target_tensor is the one-hot label tensor, same shape as prediction_tensor
        '''
        #1# focal loss without the balancing weight, as in Eq. (4) of the paper
        zeros = array_ops.zeros_like(prediction_tensor, dtype=prediction_tensor.dtype)
        # where the label is 1, target - prediction = 1 - p; elsewhere 0
        one_minus_p = array_ops.where(tf.greater(target_tensor, zeros), target_tensor - prediction_tensor, zeros)
        # tf.math.log replaces the tf.log alias removed in TF 2.x
        FT = -1 * (one_minus_p ** gamma) * tf.math.log(tf.clip_by_value(prediction_tensor, 1e-8, 1.0))

        #2# balancing weight alpha from inverse class frequency
        classes_weight = array_ops.zeros_like(prediction_tensor, dtype=prediction_tensor.dtype)

        total_num = float(sum(classes_num))
        classes_w_t1 = [total_num / ff for ff in classes_num]
        sum_ = sum(classes_w_t1)
        classes_w_t2 = [ff / sum_ for ff in classes_w_t1]   # normalize so the weights sum to 1
        classes_w_tensor = tf.convert_to_tensor(classes_w_t2, dtype=prediction_tensor.dtype)
        classes_weight += classes_w_tensor

        alpha = array_ops.where(tf.greater(target_tensor, zeros), classes_weight, zeros)

        #3# balanced focal loss
        balanced_fl = alpha * FT
        balanced_fl = tf.reduce_mean(balanced_fl)

        #4# mix in cross-entropy against the uniform distribution (label smoothing) to curb overfitting
        # reference: https://spaces.ac.cn/archives/4493
        nb_classes = len(classes_num)
        final_loss = ((1 - e) * balanced_fl
                      + e * K.categorical_crossentropy(K.ones_like(prediction_tensor) / nb_classes, prediction_tensor))

        return final_loss
    return focal_loss_fixed
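The per-class alpha weights in step #2 are simply inverse class frequencies normalized to sum to 1. A minimal pure-Python sketch of that computation (the class counts are made up for illustration):

```python
def class_weights(classes_num):
    """Inverse-frequency weights normalized to sum to 1, as in step #2 above."""
    total = float(sum(classes_num))
    raw = [total / n for n in classes_num]  # rarer classes get larger raw weights
    s = sum(raw)
    return [w / s for w in raw]

# hypothetical counts: class 0 appears 9x as often as class 1
weights = class_weights([900, 100])
print(weights)  # the rare class receives 9x the weight of the frequent one
```

The closure itself can then be plugged into Keras as a regular loss, e.g. `model.compile(optimizer='adam', loss=focal_loss(classes_num))`, assuming the model's final layer outputs per-class probabilities (softmax).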

Code link



Article link: https://www.haomeiwen.com/subject/jkpgpctx.html