Lecture 12 | (1/5) Recurrent Neural Networks

Author: Ysgc | Published 2019-10-28 11:46

    https://www.youtube.com/watch?v=YYNNTrSROa4&list=PLp-0K3kfddPwz13VqV1PaMXF6V6dYdEsj&index=15&t=0s

    (Fall 2019)


    Nonlinear autoregressive exogenous (NARX) model:
    https://en.wikipedia.org/wiki/Nonlinear_autoregressive_exogenous_model
    First proposed by M. I. Jordan.

    It has infinite memory: because past outputs are fed back in, the current output depends (indirectly) on the entire input history, not just a fixed window of past inputs.
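    In symbols (a minimal sketch; the window lengths L and M are my own notation, not from the slides), a NARX model computes

    Y(t) = f( X(t), X(t-1), ..., X(t-L), Y(t-1), Y(t-2), ..., Y(t-M) )

    Since Y(t-1) in turn depends on Y(t-2), X(t-2), and so on, unrolling the recursion makes Y(t) depend on the whole input history; that is the sense in which the memory is infinite.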

    Yellow box, red box, yellow box, red box (the alternating colors in the slide diagram).

    Feedforward network (one step):

    X ---(W_0)---> Z_0 ---(activation)---> hidden ---(W_1)---> Z_1 ---(activation)---> Y <--- Div

    Recurrent version: the previous step's Z_0 is fed back through W_11 into the current Z_0:

    Z_0(T-1) ---(W_11)---↘
    X(T) ------(W_0)----> Z_0(T) ---(activation)---> hidden(T) ---(W_1)---> Z_1(T) ---(activation)---> Y(T) <--- Div
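    A minimal NumPy sketch of the recurrent layer in the diagram above (variable names and the tanh/softmax choices are my own assumptions; the lecture does not fix them here):

    import numpy as np

    def step(x_t, z0_prev, W0, W11, W1):
        # Z_0(T) = W_0 X(T) + W_11 Z_0(T-1): the recurrent term is added ("+=")
        z0 = W0 @ x_t + W11 @ z0_prev
        hidden = np.tanh(z0)                  # hidden = activation(Z_0)
        z1 = W1 @ hidden                      # Z_1 = W_1 hidden
        y = np.exp(z1) / np.exp(z1).sum()     # Y = activation(Z_1), e.g. softmax
        return y, z0

    def run(X_seq, W0, W11, W1):
        # X_seq has shape (T, input_dim); Z_0 is carried across time as drawn above
        z0 = np.zeros(W11.shape[0])
        outputs = []
        for x_t in X_seq:
            y, z0 = step(x_t, z0, W0, W11, W1)
            outputs.append(y)
        return np.stack(outputs)

    # Note: the diagram feeds back Z_0(T-1) itself; in the more common formulation it is
    # the post-activation hidden(T-1) that is fed back, but the structure is the same.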

    Note the sign is += here: the recurrent contribution W_11 Z_0(T-1) is added to W_0 X(T) before the activation.

    Also, the past can be predicted from the future, just as the future can be predicted from the past: run 2 separate RNNs over the sequence in opposite directions and combine them before the output layer (a bidirectional RNN); a sketch follows below.
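    A rough sketch of that bidirectional idea (again with my own names; one RNN runs left to right, another right to left, and their hidden sequences are combined before the output layer):

    import numpy as np

    def run_direction(X_seq, W_in, W_rec):
        # one recurrent pass over the sequence; returns the hidden state at every step
        h = np.zeros(W_rec.shape[0])
        H = []
        for x_t in X_seq:
            h = np.tanh(W_in @ x_t + W_rec @ h)   # recurrent term added, as above
            H.append(h)
        return np.stack(H)                        # shape (T, hidden_dim)

    def bidirectional(X_seq, Wf_in, Wf_rec, Wb_in, Wb_rec, W_out):
        H_fwd = run_direction(X_seq, Wf_in, Wf_rec)               # past -> future
        H_bwd = run_direction(X_seq[::-1], Wb_in, Wb_rec)[::-1]   # future -> past, re-aligned in time
        H = np.concatenate([H_fwd, H_bwd], axis=1)                # combine before the output layer
        return H @ W_out.T                                        # per-step outputs Y(t)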
