Below is an overview of the common activation functions.
All of them are available in the tf.nn module.
The most common is the ReLU function, which sets every value below 0 to 0: max(0, x).
Because ReLU is unbounded above, its output can grow toward positive infinity, which can lead to numerical problems such as exploding activations and gradients. The ReLU6 function is ReLU with an upper bound added.
That upper bound is 6, i.e. min(max(0, x), 6).
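A minimal sketch of calling these two ops, assuming TensorFlow 2.x with eager execution (the sample values are made up for illustration):

```python
import tensorflow as tf

x = tf.constant([-3.0, -1.0, 0.0, 2.0, 10.0])

# ReLU: max(0, x) -- negatives become 0, positives pass through unchanged
print(tf.nn.relu(x).numpy())   # [ 0.  0.  0.  2. 10.]

# ReLU6: min(max(0, x), 6) -- same as ReLU but capped at 6
print(tf.nn.relu6(x).numpy())  # [0. 0. 0. 2. 6.]
```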
After ReLU6 comes the sigmoid function, 1/(1 + exp(-x)). Because sigmoid is not zero-centered, it is now rarely used directly as an activation function.
The tanh function is similar to sigmoid, but its output is zero-centered.
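A quick sketch contrasting the two (TensorFlow 2.x assumed): sigmoid outputs lie in (0, 1), while tanh outputs are zero-centered in (-1, 1).

```python
import tensorflow as tf

x = tf.constant([-2.0, 0.0, 2.0])

# sigmoid: 1 / (1 + exp(-x)), output in (0, 1), not zero-centered
print(tf.nn.sigmoid(x).numpy())  # approx. [0.119 0.5   0.881]

# tanh: output in (-1, 1), zero-centered
print(tf.nn.tanh(x).numpy())     # approx. [-0.964 0.    0.964]
```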
softsign: this function is also commonly used as an activation function; it can be viewed as a smooth approximation of the sign function.
The softsign function also gets used as an activation function. The form of this function is x/(abs(x) + 1). The softsign function is supposed to be a continuous approximation to the sign function.
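A small sketch (TensorFlow 2.x assumed) showing that tf.nn.softsign matches the formula x / (abs(x) + 1):

```python
import tensorflow as tf

x = tf.constant([-10.0, -1.0, 0.0, 1.0, 10.0])

# softsign: x / (|x| + 1), a smooth approximation of sign(x),
# saturating toward -1 and +1 more slowly than tanh
print(tf.nn.softsign(x).numpy())      # approx. [-0.909 -0.5  0.  0.5  0.909]
print((x / (tf.abs(x) + 1)).numpy())  # same values, computed from the formula
```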
softplus: it can be viewed as a smooth approximation of ReLU.
The softplus is a smooth version of the ReLU function. The form of this function is log(exp(x) + 1).
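A minimal check (TensorFlow 2.x assumed) that tf.nn.softplus equals log(exp(x) + 1) and behaves like a smoothed ReLU:

```python
import tensorflow as tf

x = tf.constant([-2.0, 0.0, 2.0])

# softplus: log(exp(x) + 1); close to 0 for large negative x,
# close to x for large positive x, smooth around the origin
print(tf.nn.softplus(x).numpy())             # approx. [0.127 0.693 2.127]
print(tf.math.log(tf.exp(x) + 1.0).numpy())  # same values, from the formula
```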
The Exponential Linear Unit (ELU) is very similar to the softplus function except that the bottom asymptote is -1 instead of 0. The form is (exp(x) - 1) if x < 0 else x.
In other words, the lower bound of ELU is -1, and it is another smooth relative of ReLU.
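A sketch (TensorFlow 2.x assumed) showing ELU's negative branch saturating toward -1 while the positive branch stays the identity:

```python
import tensorflow as tf

x = tf.constant([-5.0, -1.0, 0.0, 1.0, 5.0])

# elu: exp(x) - 1 for x < 0 (asymptote at -1), x for x >= 0
print(tf.nn.elu(x).numpy())  # approx. [-0.993 -0.632  0.  1.  5.]
```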
Comparison plot of the activation functions:
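The original comparison figure is not reproduced here; the following matplotlib sketch (assuming TensorFlow 2.x and matplotlib are installed) draws a similar side-by-side view of the functions discussed above:

```python
import numpy as np
import matplotlib.pyplot as plt
import tensorflow as tf

x = np.linspace(-4.0, 4.0, 200).astype(np.float32)
activations = {
    "relu": tf.nn.relu,
    "relu6": tf.nn.relu6,
    "sigmoid": tf.nn.sigmoid,
    "tanh": tf.nn.tanh,
    "softsign": tf.nn.softsign,
    "softplus": tf.nn.softplus,
    "elu": tf.nn.elu,
}

# Plot every activation on the same axes for comparison
for name, fn in activations.items():
    plt.plot(x, fn(x).numpy(), label=name)
plt.legend()
plt.grid(True)
plt.show()
```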