CMU Neural Nets for NLP

Author: 想加颜表情的tsi | Published 2019-03-18 13:32

    1.

    Before class: read material on the topic

    ⭐️About code: GitHub: neubig/nn4nlp-code

    Assignment1: Text Classifier/Questionnaire

    Assignment2: SOTA Survey

    Assignment3: SOTA Re-implementation

    Assignment4: Final Project

    Bag of words: feed the words into a function and add them up; the result is a score.

    Each word vector has features, and feature combinations matter (e.g. feature1 + feature5 = positive).

    All of this can be expressed as a computation graph (any such model can be converted into a graph).
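A bag-of-words scorer like the one described can be sketched in a few lines. The toy weights and the `bow_score` helper below are illustrative assumptions, not the course code:

```python
# Minimal bag-of-words scorer (a sketch, not the course code):
# each word contributes a learned per-word score; the sentence
# score is their sum plus a bias.

word_scores = {"great": 1.0, "boring": -1.0, "movie": 0.0}  # toy weights

def bow_score(words, weights, bias=0.0):
    """Sum the per-word scores; unknown words contribute 0."""
    return bias + sum(weights.get(w, 0.0) for w in words)

print(bow_score(["great", "movie"], word_scores))  # 1.0
```

In a real model these per-word scores are learned parameters rather than hand-set values.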


    A node X in the graph can hold a {tensor, matrix, vector, scalar} value.

    The upper-left node represents a function; each incoming edge passes in one argument (the edge's source node).

    An example (figure omitted).

    Algorithm:

    Forward propagation

    Back-propagation (of a loss function: a value we want to minimize)

    Parameter update
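The three phases above can be sketched for a tiny linear model. This is a minimal NumPy illustration with assumed toy data, not the lecture's DyNet code:

```python
import numpy as np

# One hundred training steps of the three-phase algorithm, for a tiny
# linear model y = w.x with squared-error loss (illustrative only).

w = np.zeros(3)                    # parameters
x = np.array([1.0, 2.0, 3.0])      # toy input
y_true = 2.0                       # toy target

for step in range(100):
    # 1. Forward propagation: compute the prediction and the loss
    y_pred = w @ x
    loss = 0.5 * (y_pred - y_true) ** 2
    # 2. Back-propagation: gradient of the loss w.r.t. the parameters
    grad = (y_pred - y_true) * x
    # 3. Parameter update: step against the gradient
    w -= 0.01 * grad
```

After enough steps the loss shrinks toward zero; frameworks automate step 2 by traversing the computation graph in reverse.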

    Neural network frameworks: static: Theano, Caffe, MXNet, TensorFlow

          dynamic: DyNet, Chainer, PyTorch

    ⭐️Basic workflow

    GitHub: the first example project

    First convert every word to an integer index, then work with vectors and matrices (diagram omitted).

    For the continuous bag of words model, only the `# define the model` section needs to change (the rest of the code stays the same).
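A hedged sketch of what that model definition might look like for CBOW: look up an embedding per word, sum them, then project to tag scores. The shapes, names, and NumPy formulation here are assumptions for illustration, not the repository's DyNet code:

```python
import numpy as np

# Sketch of the "define the model" part for CBOW (assumed shapes):
# embeddings are summed over the words, then a linear layer scores tags.

vocab_size, emb_dim, num_tags = 10, 4, 3
rng = np.random.default_rng(0)
E = rng.normal(size=(vocab_size, emb_dim))   # word embedding table
W = rng.normal(size=(num_tags, emb_dim))     # output projection
b = np.zeros(num_tags)                       # output bias

def cbow_scores(word_ids):
    """CBOW: sum the embeddings of the words, then apply a linear layer."""
    h = E[word_ids].sum(axis=0)   # the "continuous bag of words" vector
    return W @ h + b              # one score per tag

scores = cbow_scores([1, 4, 7])
```

The rest of the training loop (loss, backprop, update) is unchanged, which is why only the model-definition section differs between the plain BoW and CBOW examples.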

    Class plan:

    TOPIC 1: Model of words

    TOPIC 2: Model of sentences

    TOPIC 3: Implementing, Debugging, interpreting

    TOPIC 4: sequence-to-sequence models

    TOPIC 5: Structured Prediction Models

    and so on

    2.

    Language models: can help score sentences and generate sentences

    Problem 1: similar words -> class-based language models

    Problem 2: intervening words -> skip-gram LM

    Problem 3: long-distance dependencies -> cache
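To make "scoring sentences" concrete, here is a toy count-based bigram scorer with add-alpha smoothing. The corpus, vocabulary size, and smoothing constant are assumptions for illustration; the lecture's models are neural, not count-based:

```python
import math
from collections import Counter

# Toy bigram language model: score = sum of log P(word | previous word).

corpus = [["<s>", "the", "cat", "sat", "</s>"],
          ["<s>", "the", "dog", "sat", "</s>"]]

bigrams, unigrams = Counter(), Counter()
for sent in corpus:
    for prev, word in zip(sent, sent[1:]):
        bigrams[(prev, word)] += 1
        unigrams[prev] += 1

def log_score(sentence, alpha=0.1, vocab=6):
    """Log-probability of a sentence under the smoothed bigram model."""
    total = 0.0
    for prev, word in zip(sentence, sentence[1:]):
        p = (bigrams[(prev, word)] + alpha) / (unigrams[prev] + alpha * vocab)
        total += math.log(p)
    return total
```

A fluent sentence from the corpus scores higher than a scrambled one, which is exactly the "scoring" use of a language model mentioned above.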

    Softmax:

    A computation graph view

    Loss function: a measure of how bad our predictions are

    Parameter update: shift the parameters a little (e.g. a gradient step) to reduce the loss
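The softmax, the loss, and the parameter update can be tied together in one short generic sketch. The single linear layer, input, and learning rate are assumptions for illustration:

```python
import numpy as np

# Softmax over class scores, cross-entropy loss, and one gradient step.

def softmax(z):
    z = z - z.max()          # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def cross_entropy(probs, gold):
    """Loss: how bad the prediction is (negative log-prob of the gold class)."""
    return -np.log(probs[gold])

rng = np.random.default_rng(0)
W = rng.normal(size=(3, 2))      # linear layer: 3 classes from a 2-d input
x = np.array([1.0, -1.0])
gold = 2

probs = softmax(W @ x)
loss = cross_entropy(probs, gold)

# Parameter update: for softmax + cross-entropy, the gradient w.r.t. the
# scores is (probs - one_hot(gold)); propagate it into W and step.
grad_scores = probs.copy()
grad_scores[gold] -= 1.0
W -= 0.1 * np.outer(grad_scores, x)

new_loss = cross_entropy(softmax(W @ x), gold)
```

One update already lowers the loss, which is the whole point of the forward / backward / update cycle.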

    Hands-on: 02-lm


          Original post: https://www.haomeiwen.com/subject/vtpvmqtx.html