something about softmax

Author: 捡个七 | Published 2019-06-20 10:58

[1]. Softmax vs. Softmax-Loss: Numerical Stability

using LinearAlgebra   # for Diagonal

function softmax(z)
  z = z .- maximum(z)          # shift by the max to avoid overflow in exp
  o = exp.(z)                  # elementwise exponential
  return o / sum(o)
end

function gradient_together(z, y)
  # fused softmax + cross-entropy gradient: softmax(z) - onehot(y)
  o = softmax(z)
  o[y] -= 1.0
  return o
end

function gradient_separated(z, y)
  # same gradient via the chain rule through the softmax Jacobian
  o = softmax(z)
  ∂o_∂z = Diagonal(o) - o * o'      # Jacobian of softmax (symmetric)
  ∂f_∂o = zeros(length(o))
  ∂f_∂o[y] = -1.0 / o[y]            # ∂(-log o[y]) / ∂o
  return ∂o_∂z * ∂f_∂o
end
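As a quick sanity check (illustrative values, not from the original post), the fused gradient and the chain-rule gradient should agree up to floating-point error:

```julia
using LinearAlgebra

# restated here so the snippet runs standalone
softmax(z) = (e = exp.(z .- maximum(z)); e / sum(e))

z = [1.0, 2.0, 3.0]          # hypothetical logits
y = 2                        # hypothetical target class

# fused form: softmax(z) - onehot(y)
g1 = softmax(z); g1[y] -= 1.0

# chain-rule form: Jacobian of softmax times ∂(-log o[y])/∂o
o = softmax(z)
g2 = (Diagonal(o) - o * o') * [i == y ? -1.0 / o[y] : 0.0 for i in eachindex(o)]

maximum(abs.(g1 - g2))       # tiny; the two forms compute the same gradient
```

The fused form is both cheaper (no Jacobian materialized) and numerically safer, since it never divides by a possibly tiny o[y].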

[2]. Backpropagation, Part 1: The Softmax Function (反向传播之一:softmax函数)

Softmax is invariant to adding a constant to every input, i.e. softmax(z) = softmax(z - c). Using this property, subtracting the maximum before exponentiating prevents overflow without changing the result.
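A minimal sketch of why the shift matters (function names are illustrative): Float64 exp overflows to Inf for inputs above roughly 709, so the naive form returns NaN for large logits, while the shifted form is exact.

```julia
# naive version: exp overflows to Inf for large inputs, giving Inf / Inf = NaN
naive_softmax(z)  = exp.(z) ./ sum(exp.(z))
# shifted version: mathematically identical, numerically safe
stable_softmax(z) = (e = exp.(z .- maximum(z)); e / sum(e))

z = [1000.0, 1000.0]
naive_softmax(z)    # [NaN, NaN]
stable_softmax(z)   # [0.5, 0.5]
```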

[3]. PyTorch - VGG output layer - no softmax?

The reason this is done is that you only need the softmax layer at inference time. During training, the loss can be computed directly from the logits without an explicit softmax, which reduces the number of computations.
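To make the quoted point concrete, here is a hedged sketch in Julia (the function name is hypothetical) of what PyTorch's CrossEntropyLoss does internally: the cross-entropy of the target class can be computed straight from the logits via log-sum-exp, so no explicit softmax layer is needed during training.

```julia
# cross-entropy for class y computed directly from logits z:
#   -log softmax(z)[y] = logsumexp(z) - z[y]
function cross_entropy_from_logits(z, y)
  m = maximum(z)                           # shift for stability
  return m + log(sum(exp.(z .- m))) - z[y]
end

loss = cross_entropy_from_logits([1.0, 2.0, 3.0], 2)
```

Beyond saving work, this fused form avoids the log(softmax(z)) round trip, which can lose precision when a probability underflows to zero.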


Original post: https://www.haomeiwen.com/subject/xeplqctx.html