Lecture 14 | (3/5) Recurrent Neu


Author: Ysgc | Published 2019-11-02 13:44

https://www.youtube.com/watch?v=ItYyu3KQvOQ

Code generated by an RNN.

Input: an (n−1) × 100 matrix, i.e. each of the n−1 characters as a 100-dimensional one-hot vector.

Only 1% of the space is used: inefficient!
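The sparsity point can be seen in a minimal sketch, assuming a hypothetical 100-character vocabulary (the "× 100" from the notes):

```python
import numpy as np

VOCAB_SIZE = 100  # assumed vocabulary size

def one_hot(index, size=VOCAB_SIZE):
    """Encode a character index as a one-hot vector."""
    v = np.zeros(size)
    v[index] = 1.0
    return v

x = one_hot(42)
nonzero_fraction = np.count_nonzero(x) / x.size
print(nonzero_fraction)  # 0.01 -- only 1% of the dimensions carry information
```

Every character occupies its own axis, so 99 of the 100 entries are always zero.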

Projecting from the N-dimensional space down to an M-dimensional subspace has both an advantage and a disadvantage.

And it is a learnable transformation!
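A sketch of that learnable projection, with hypothetical sizes N = 100 and M = 8: multiplying a one-hot vector by a trainable matrix W is equivalent to looking up a row of W, which is exactly what an embedding layer does.

```python
import numpy as np

rng = np.random.default_rng(0)
N, M = 100, 8  # hypothetical sizes: 100-dim one-hot input, 8-dim subspace

# W is learned jointly with the rest of the network.
W = rng.normal(scale=0.1, size=(N, M))

def embed(char_index):
    """One-hot times W selects a single row of W: an N -> M projection."""
    return W[char_index]

# The matrix product and the row lookup agree:
x = np.zeros(N)
x[42] = 1.0
assert np.allclose(x @ W, embed(42))
```

Because W is trained with the network, characters that behave similarly can end up with nearby M-dimensional representations.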

- Time-delayed neural networks end up capturing some semantic relationships.
- If we consider only the final error, the strategy above works for the "many to one" case.
- There are two problems. One of them: how to train? Here is a recording of "hello", but there is no label for every time step. This is the alignment problem.
- Solution: CTC (Connectionist Temporal Classification).
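CTC handles the alignment problem by summing the probability of the target label sequence over all frame-level alignments, with an optional blank between labels. A minimal numpy sketch of the CTC forward pass (blank index 0 is an assumption; production code would use log-space arithmetic and a library implementation such as PyTorch's `nn.CTCLoss`):

```python
import numpy as np

BLANK = 0  # assumed index of the blank symbol

def ctc_forward(probs, target):
    """probs: (T, C) per-frame class probabilities; target: label indices (no blanks).
    Returns P(target | probs), summed over all valid alignments."""
    T, C = probs.shape
    ext = [BLANK]
    for c in target:
        ext += [c, BLANK]            # interleave blanks: [h, i] -> [_, h, _, i, _]
    S = len(ext)
    alpha = np.zeros((T, S))
    alpha[0, 0] = probs[0, BLANK]    # start with a blank ...
    if S > 1:
        alpha[0, 1] = probs[0, ext[1]]   # ... or with the first label
    for t in range(1, T):
        for s in range(S):
            a = alpha[t - 1, s]                      # stay on the same symbol
            if s >= 1:
                a += alpha[t - 1, s - 1]             # advance by one
            if s >= 2 and ext[s] != BLANK and ext[s] != ext[s - 2]:
                a += alpha[t - 1, s - 2]             # skip a blank between distinct labels
            alpha[t, s] = a * probs[t, ext[s]]
    # Valid paths end on the last label or the trailing blank.
    return alpha[T - 1, S - 1] + (alpha[T - 1, S - 2] if S > 1 else 0.0)

# Two frames, two classes, uniform probabilities: frame sequences 01, 10, 11
# all collapse to the label "1", so P = 3/4.
p = ctc_forward(np.full((2, 2), 0.5), [1])
print(p)  # 0.75
```

The negative log of this probability is the CTC loss; training backpropagates through the recursion, so no per-frame labels are ever needed.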


Article link: https://www.haomeiwen.com/subject/bmycbctx.html