Backpropagation in ResNet and DenseNet, explained


Author: 迷途的Go | Published 2018-10-20 11:14


Backpropagation in a plain network

(figure: resnet_block.png)

z = W_1 x,\quad h = \phi(z),\quad o = W_2 h,\quad L = (o-y)^2

Initialize the network parameters with some values; after a forward pass, the values of x, z, h, and o in the figure are all known.

\frac{\partial L}{\partial W_2}= \frac{\partial L}{\partial o}\cdot \frac{\partial o}{\partial W_2}. Since o and h are known, \frac{\partial o}{\partial W_2}=h^T and \frac{\partial L}{\partial o}=2(o-y); substituting gives \frac{\partial L}{\partial W_2}=2(o-y)\,h^T.

\frac{\partial L}{\partial W_1}=\frac{\partial L}{\partial o} \frac{\partial o}{\partial h} \frac{\partial h}{\partial z} \frac{\partial z}{\partial W_1}=\left(W_2^T \cdot 2(o-y)\right) \odot \phi'(z)\cdot x^T
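The chain rule above can be checked numerically. A minimal NumPy sketch, assuming tanh for \phi and concrete shapes (these choices are illustrative, not from the original):

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny two-layer network: z = W1 @ x, h = phi(z), o = W2 @ h, L = sum((o-y)^2)
x = rng.normal(size=(3, 1))
y = rng.normal(size=(2, 1))
W1 = rng.normal(size=(4, 3))
W2 = rng.normal(size=(2, 4))

def forward(W1, W2):
    z = W1 @ x
    h = np.tanh(z)          # phi = tanh (an assumed concrete activation)
    o = W2 @ h
    return z, h, o, np.sum((o - y) ** 2)

z, h, o, L = forward(W1, W2)

# Analytic gradients, exactly as derived in the text
dL_do = 2 * (o - y)                              # dL/do
dL_dW2 = dL_do @ h.T                             # dL/dW2 = 2(o-y) h^T
dL_dz = (W2.T @ dL_do) * (1 - np.tanh(z) ** 2)   # backprop through phi: tanh'(z) = 1 - tanh(z)^2
dL_dW1 = dL_dz @ x.T                             # dL/dW1 = (...) x^T

# Central-difference check on one entry of W1
eps = 1e-6
W1p = W1.copy(); W1p[0, 0] += eps
W1m = W1.copy(); W1m[0, 0] -= eps
num = (forward(W1p, W2)[3] - forward(W1m, W2)[3]) / (2 * eps)
```

The numerical gradient `num` should match the analytic entry `dL_dW1[0, 0]` to several decimal places, confirming the derivation.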

Backpropagation in ResNet

A plain two-layer block


(figure: resnet_block.png)

For a plain block, where o = W l_1, \frac{\partial L}{\partial l_1}=\frac{\partial L}{\partial l_2} \frac{\partial l_2}{\partial l_1}=\frac{\partial L}{\partial l_2} \frac{\partial l_2}{\partial o}W

A two-layer ResNet block


(figure: resnet_block1.png)

\frac{\partial L}{\partial l_1}=\frac{\partial L}{\partial l_2}\frac{\partial l_2}{\partial o}\frac{\partial o}{\partial l_1}=\frac{\partial L}{\partial l_2}\frac{\partial l_2}{\partial o} \frac{\partial(o_1+l_1)}{\partial l_1}=\frac{\partial L}{\partial l_2} \frac{\partial l_2}{\partial o}(W+I)

Compared with the two-layer block without the residual connection, an identity term appears alongside W, so the gradient is passed back without attenuation, avoiding vanishing gradients.
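The effect of that identity term can be seen concretely. A small NumPy sketch (the tiny weights and shapes are assumptions chosen to make the attenuation visible):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
W = rng.normal(size=(n, n)) * 0.01   # deliberately small weights: a "weak" layer

# Plain block:    o = W @ l1       -> Jacobian do/dl1 = W
# Residual block: o = W @ l1 + l1  -> Jacobian do/dl1 = W + I
J_plain = W
J_res = W + np.eye(n)

# Push an incoming gradient dL/do back through each block
g = rng.normal(size=(n, 1))
norm_plain = np.linalg.norm(J_plain.T @ g)  # shrunk toward zero by the small W
norm_res = np.linalg.norm(J_res.T @ g)      # stays close to ||g|| thanks to I
```

With small W, `norm_plain` collapses while `norm_res` remains near the norm of the incoming gradient, which is exactly the "gradient passed back without attenuation" argument above.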

Backpropagation in DenseNet

A two-layer DenseNet block is structured as follows:


(figure: densenet_block.png)

The gradient backpropagated from L to l_1:

\frac{\partial L}{\partial l_1}=\frac{\partial L}{\partial l_2}\frac{\partial l_2}{\partial o}\frac{\partial o}{\partial l_1}=\frac{\partial L}{\partial l_2}\frac{\partial l_2}{\partial o} \frac{\partial(o_1 \operatorname{cat} l_1)}{\partial l_1}=\frac{\partial L}{\partial l_2} \frac{\partial l_2}{\partial o}(W \operatorname{cat} I)

Here cat is the concatenation operator (cat = concatenate), which joins two tensors along a given dimension; W \operatorname{cat} I appends an identity block matching the shape of l_1 onto W, so, as in ResNet, the identity rows carry the gradient back undiminished.
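The concatenation Jacobian can be verified directly. A NumPy sketch with assumed shapes (W is 4×3, so the concatenated output has 7 features):

```python
import numpy as np

rng = np.random.default_rng(2)
W = rng.normal(size=(4, 3))
l1 = rng.normal(size=(3, 1))

# Dense block: o = cat(W @ l1, l1), concatenating along the feature axis
o = np.concatenate([W @ l1, l1], axis=0)

# Jacobian do/dl1 = cat(W, I): identity rows pass the gradient straight back
J = np.concatenate([W, np.eye(3)], axis=0)

# Backprop an incoming gradient dL/do of shape (7, 1) through the block
g = rng.normal(size=(7, 1))
dL_dl1 = J.T @ g

# Equivalent split form: l1 receives both the W path and the skip path
dL_dl1_split = W.T @ g[:4] + g[4:]
```

Both forms agree exactly: the `g[4:]` term is the skip-path gradient that reaches l_1 with no attenuation, mirroring the (W cat I) factor in the formula.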
