TextRNN

Author: 骑鲸公子_ | Published 2018-04-23 09:21

    Paper: Recurrent Neural Network for Text Classification with Multi-Task Learning

    1 Introduction

    Drawback of DNNs: they usually need a large-scale corpus due to their large number of parameters, and it is hard to train a network that generalizes well with limited data.

    The paper proposes three information-sharing schemes for multi-task learning. The first model uses just one shared layer for all the tasks.

    The second model uses different layers for different tasks, but each layer can read information from other layers.

    The third model not only assigns one specific layer to each task, but also builds a shared layer for all the tasks. (A minimal sketch of the shared-layer idea follows below.)
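    A minimal sketch of the shared-layer idea in TensorFlow/Keras: one LSTM produces a representation shared by all tasks, with a task-specific softmax head per task. All sizes and the two hypothetical tasks are assumptions for illustration, not details from the paper.

```python
import tensorflow as tf

# One shared LSTM, one softmax head per task (sizes are assumed).
vocab_size, embed_dim, hidden_dim = 10000, 128, 64

tokens = tf.keras.Input(shape=(None,), dtype="int32")        # padded token ids
x = tf.keras.layers.Embedding(vocab_size, embed_dim)(tokens)
shared = tf.keras.layers.LSTM(hidden_dim)(x)                 # representation shared by all tasks

# Task-specific classification heads (two hypothetical tasks).
out_task_a = tf.keras.layers.Dense(2, activation="softmax", name="task_a")(shared)
out_task_b = tf.keras.layers.Dense(5, activation="softmax", name="task_b")(shared)

model = tf.keras.Model(tokens, [out_task_a, out_task_b])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```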


    2 Recurrent Neural Network for Specific-Task Text Classification

    2.1 Recurrent Neural Network

    Drawback: vanishing gradients.

    RNN: the activation of the hidden state $h_t$ depends on the current input $x_t$ and the previous hidden state:

    $h_t = \tanh(W x_t + U h_{t-1} + b)$

    LSTM: designed for learning long-term dependencies. The LSTM transition equations are:

    $i_t = \sigma(W_i x_t + U_i h_{t-1} + b_i)$
    $f_t = \sigma(W_f x_t + U_f h_{t-1} + b_f)$
    $o_t = \sigma(W_o x_t + U_o h_{t-1} + b_o)$
    $\tilde{c}_t = \tanh(W_c x_t + U_c h_{t-1} + b_c)$
    $c_t = f_t \odot c_{t-1} + i_t \odot \tilde{c}_t$
    $h_t = o_t \odot \tanh(c_t)$
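    A direct NumPy transcription of these transition equations, as a sketch (weight shapes and the `params` dictionary layout are assumptions; $\sigma$ is the logistic sigmoid):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, params):
    """One LSTM step following the transition equations above.

    params holds weight matrices W_*, U_* and biases b_* for the
    input (i), forget (f), output (o) gates and the candidate cell (c).
    """
    p = params
    i = sigmoid(p["W_i"] @ x_t + p["U_i"] @ h_prev + p["b_i"])   # input gate
    f = sigmoid(p["W_f"] @ x_t + p["U_f"] @ h_prev + p["b_f"])   # forget gate
    o = sigmoid(p["W_o"] @ x_t + p["U_o"] @ h_prev + p["b_o"])   # output gate
    c_tilde = np.tanh(p["W_c"] @ x_t + p["U_c"] @ h_prev + p["b_c"])
    c = f * c_prev + i * c_tilde    # new cell state
    h = o * np.tanh(c)              # new hidden state
    return h, c
```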




    Paper: A Bi-LSTM-RNN Model for Relation Classification Using Low-Cost Sequence Features

    1. Introduction

    ① The model performs bi-directional recurrent computation along all the tokens of the sentences that the relation spans.

    ② The sequence of token representations generated in the previous step is divided into five parts, according to the order in which the tokens occur in these sentences (see Figure 1).

    ③ Standard pooling functions are applied over the token representations of each part, yielding five representations corresponding to the five parts.

    ④ These five representations are concatenated and fed into a softmax layer for relation classification. (A sketch of steps ②–④ follows after the figure.)

    Figure 1: overview of the Bi-LSTM-RNN model, with the token sequence divided into five parts around the two target entities.
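    A NumPy sketch of steps ②–④, assuming the bi-directional token representations have already been computed. The function name, entity-span arguments, and the choice of max pooling are illustrative, not taken verbatim from the paper:

```python
import numpy as np

def five_part_pooling(token_reprs, e1_span, e2_span):
    """Split the token representations into five parts around the two
    target entities and pool each part (steps 2-3 above).

    token_reprs: (seq_len, dim) array of Bi-LSTM outputs.
    e1_span, e2_span: (start, end) token indices, end exclusive, e1 before e2.
    """
    (s1, t1), (s2, t2) = e1_span, e2_span
    parts = [
        token_reprs[:s1],      # before the former entity
        token_reprs[s1:t1],    # the former entity
        token_reprs[t1:s2],    # between the two entities
        token_reprs[s2:t2],    # the latter entity
        token_reprs[t2:],      # after the latter entity
    ]
    dim = token_reprs.shape[1]
    # Max-pool each part; an empty part falls back to zeros.
    pooled = [p.max(axis=0) if len(p) else np.zeros(dim) for p in parts]
    return np.concatenate(pooled)  # (5 * dim,) vector for the softmax layer (step 4)
```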

    LSTMs are used to attenuate the gradient vanishing problem when the two target entities are distant in the text.


    2. Related Work

    3. Our Bi-LSTM-RNN Model

    3.1. Long Short Term Memory (LSTM)

    (The standard LSTM transition equations; see the formulation given above in the notes on the first paper.)

    3.2. Bi-LSTM-RNN

    (Figure: the Bi-LSTM-RNN model architecture.)

    Supplement (further reading):

    RNN and LSTM

    Implementing an RNN in TensorFlow

    Building an RNN with TensorFlow
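    As a companion to those references, a minimal TextRNN-style sentence classifier in TensorFlow/Keras; every size here (vocabulary, embedding dimension, LSTM units, 2 classes) is assumed for illustration.

```python
import tensorflow as tf

# Bi-directional LSTM over token embeddings, then a softmax classifier.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(None,), dtype="int32"),   # padded token ids
    tf.keras.layers.Embedding(input_dim=10000, output_dim=128),
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(64)),
    tf.keras.layers.Dense(2, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(token_ids, labels, ...) trains it on integer-encoded text.
```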

