Getting Started with Rasa

Author: 魏鹏飞 | Published 2019-07-29 11:30

    1. What is Rasa?

    Rasa is an open-source machine learning framework for building contextual AI assistants and chatbots.

    Rasa has two main modules:

    • NLU, which understands the user's messages
    • Core, which manages the conversation and decides what to do next (a quick illustration of the split follows this list)
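
    As a rough sketch of that split, assuming a trained assistant is being served locally (for example with rasa run --enable-api, and with the REST channel enabled in credentials.yml, as in the default scaffold), the two modules can be exercised separately over HTTP:

    # NLU only: parse a message into an intent and entities
    curl -s -X POST localhost:5005/model/parse \
         -H "Content-Type: application/json" \
         -d '{"text": "hello there"}'

    # NLU + Core: a full dialogue turn through the REST channel
    curl -s -X POST localhost:5005/webhooks/rest/webhook \
         -H "Content-Type: application/json" \
         -d '{"sender": "test_user", "message": "hello there"}'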

    Rasa X is a tool that helps you build, improve, and deploy AI assistants powered by the Rasa framework. Rasa X includes a user interface and a REST API.

    (Figure: Rasa + Rasa X)

    2. Building an AI Assistant with Rasa

    The rasa init command creates the scaffolding for a new Rasa project, including training data and a few configuration files. It also trains your first model on some sample training data.

    rasa init --no-prompt
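
    After editing the training data or configuration, the model can be retrained at any time with the standard Rasa CLI:

    # retrain NLU + Core after changing data/ or domain.yml
    rasa train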
    

    Directory structure:

    (Figure: project directory created by rasa init)
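
    For reference, a freshly generated Rasa 1.x project typically looks roughly like this (names may vary slightly between versions):

    RasaProjects/
    ├── __init__.py
    ├── actions.py        # custom action code (optional)
    ├── config.yml        # NLU pipeline and Core policies
    ├── credentials.yml   # messaging channel credentials (REST, Slack, ...)
    ├── data/
    │   ├── nlu.md        # NLU training examples
    │   └── stories.md    # Core training stories
    ├── domain.yml        # intents, actions, and response templates
    ├── endpoints.yml     # action server / tracker store endpoints
    └── models/           # trained models (*.tar.gz)
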
    # Output
    
    Welcome to Rasa! 🤖
    
    To get started quickly, an initial project will be created.
    If you need some help, check out the documentation at https://rasa.com/docs/rasa.
    
    Created project directory at '/Users/weipengfei/workspaces/RasaProjects'.
    Finished creating project structure.
    Training an initial model...
    Training Core model...
    Processed Story Blocks: 100%|█| 4/4 [00:00<00:00, 4236.67it/s, # trackers=1]
    Processed Story Blocks: 100%|█| 4/4 [00:00<00:00, 2035.58it/s, # trackers=4]
    Processed Story Blocks: 100%|█| 4/4 [00:00<00:00, 641.23it/s, # trackers=12]
    Processed Story Blocks: 100%|██| 4/4 [00:00<00:00, 934.09it/s, # trackers=7]
    Processed trackers: 100%|█████| 4/4 [00:00<00:00, 2861.05it/s, # actions=14]
    Processed actions: 14it [00:00, 8074.84it/s, # examples=14]
    Processed trackers: 100%|███| 94/94 [00:00<00:00, 1608.27it/s, # actions=62]
    _________________________________________________________________
    Layer (type)                 Output Shape              Param #   
    =================================================================
    masking (Masking)            (None, 5, 19)             0         
    _________________________________________________________________
    lstm (LSTM)                  (None, 32)                6656      
    _________________________________________________________________
    dense (Dense)                (None, 13)                429       
    _________________________________________________________________
    activation (Activation)      (None, 13)                0         
    =================================================================
    Total params: 7,085
    Trainable params: 7,085
    Non-trainable params: 0
    _________________________________________________________________
    2019-07-29 11:10:29 INFO     rasa.core.policies.keras_policy  - Fitting model with 62 total samples and a validation split of 0.1
    Epoch 1/100
    62/62 [==============================] - 1s 12ms/sample - loss: 2.5240 - acc: 0.2903
    Epoch 2/100
    62/62 [==============================] - 0s 188us/sample - loss: 2.4984 - acc: 0.3226
    Epoch 3/100
    62/62 [==============================] - 0s 194us/sample - loss: 2.4557 - acc: 0.3548
    Epoch 4/100
    62/62 [==============================] - 0s 199us/sample - loss: 2.4287 - acc: 0.3548
    Epoch 5/100
    62/62 [==============================] - 0s 247us/sample - loss: 2.4104 - acc: 0.4194
    Epoch 6/100
    62/62 [==============================] - 0s 192us/sample - loss: 2.3961 - acc: 0.4032
    Epoch 7/100
    62/62 [==============================] - 0s 205us/sample - loss: 2.3686 - acc: 0.4355
    Epoch 8/100
    62/62 [==============================] - 0s 186us/sample - loss: 2.3098 - acc: 0.4194
    Epoch 9/100
    62/62 [==============================] - 0s 194us/sample - loss: 2.2944 - acc: 0.4194
    Epoch 10/100
    62/62 [==============================] - 0s 197us/sample - loss: 2.2666 - acc: 0.4516
    Epoch 11/100
    62/62 [==============================] - 0s 186us/sample - loss: 2.2202 - acc: 0.4355
    Epoch 12/100
    62/62 [==============================] - 0s 200us/sample - loss: 2.2005 - acc: 0.4355
    Epoch 13/100
    62/62 [==============================] - 0s 186us/sample - loss: 2.1775 - acc: 0.4355
    Epoch 14/100
    62/62 [==============================] - 0s 204us/sample - loss: 2.1157 - acc: 0.4355
    Epoch 15/100
    62/62 [==============================] - 0s 187us/sample - loss: 2.0968 - acc: 0.4355
    Epoch 16/100
    62/62 [==============================] - 0s 221us/sample - loss: 2.0557 - acc: 0.4355
    Epoch 17/100
    62/62 [==============================] - 0s 188us/sample - loss: 2.0413 - acc: 0.4355
    Epoch 18/100
    62/62 [==============================] - 0s 186us/sample - loss: 2.0061 - acc: 0.4355
    Epoch 19/100
    62/62 [==============================] - 0s 191us/sample - loss: 1.9796 - acc: 0.4355
    Epoch 20/100
    62/62 [==============================] - 0s 180us/sample - loss: 1.9174 - acc: 0.4355
    Epoch 21/100
    62/62 [==============================] - 0s 197us/sample - loss: 1.9245 - acc: 0.4355
    Epoch 22/100
    62/62 [==============================] - 0s 182us/sample - loss: 1.8879 - acc: 0.4355
    Epoch 23/100
    62/62 [==============================] - 0s 220us/sample - loss: 1.8528 - acc: 0.4355
    Epoch 24/100
    62/62 [==============================] - 0s 182us/sample - loss: 1.7882 - acc: 0.4355
    Epoch 25/100
    62/62 [==============================] - 0s 189us/sample - loss: 1.8093 - acc: 0.4355
    Epoch 26/100
    62/62 [==============================] - 0s 181us/sample - loss: 1.7950 - acc: 0.4355
    Epoch 27/100
    62/62 [==============================] - 0s 193us/sample - loss: 1.7785 - acc: 0.4355
    Epoch 28/100
    62/62 [==============================] - 0s 197us/sample - loss: 1.7653 - acc: 0.4355
    Epoch 29/100
    62/62 [==============================] - 0s 221us/sample - loss: 1.7613 - acc: 0.4355
    Epoch 30/100
    62/62 [==============================] - 0s 216us/sample - loss: 1.7423 - acc: 0.4355
    Epoch 31/100
    62/62 [==============================] - 0s 185us/sample - loss: 1.7363 - acc: 0.4355
    Epoch 32/100
    62/62 [==============================] - 0s 238us/sample - loss: 1.6864 - acc: 0.4355
    Epoch 33/100
    62/62 [==============================] - 0s 198us/sample - loss: 1.6771 - acc: 0.4355
    Epoch 34/100
    62/62 [==============================] - 0s 191us/sample - loss: 1.6826 - acc: 0.4355
    Epoch 35/100
    62/62 [==============================] - 0s 180us/sample - loss: 1.6655 - acc: 0.4355
    Epoch 36/100
    62/62 [==============================] - 0s 190us/sample - loss: 1.6314 - acc: 0.4355
    Epoch 37/100
    62/62 [==============================] - 0s 183us/sample - loss: 1.6295 - acc: 0.4355
    Epoch 38/100
    62/62 [==============================] - 0s 181us/sample - loss: 1.5983 - acc: 0.4355
    Epoch 39/100
    62/62 [==============================] - 0s 207us/sample - loss: 1.6012 - acc: 0.4355
    Epoch 40/100
    62/62 [==============================] - 0s 230us/sample - loss: 1.5771 - acc: 0.4355
    Epoch 41/100
    62/62 [==============================] - 0s 227us/sample - loss: 1.5864 - acc: 0.4355
    Epoch 42/100
    62/62 [==============================] - 0s 214us/sample - loss: 1.5663 - acc: 0.4355
    Epoch 43/100
    62/62 [==============================] - 0s 227us/sample - loss: 1.5479 - acc: 0.4355
    Epoch 44/100
    62/62 [==============================] - 0s 244us/sample - loss: 1.5487 - acc: 0.4355
    Epoch 45/100
    62/62 [==============================] - 0s 237us/sample - loss: 1.5426 - acc: 0.4355
    Epoch 46/100
    62/62 [==============================] - 0s 243us/sample - loss: 1.5315 - acc: 0.4355
    Epoch 47/100
    62/62 [==============================] - 0s 246us/sample - loss: 1.5264 - acc: 0.4355
    Epoch 48/100
    62/62 [==============================] - 0s 241us/sample - loss: 1.5074 - acc: 0.4355
    Epoch 49/100
    62/62 [==============================] - 0s 241us/sample - loss: 1.5014 - acc: 0.4355
    Epoch 50/100
    62/62 [==============================] - 0s 212us/sample - loss: 1.4918 - acc: 0.4355
    Epoch 51/100
    62/62 [==============================] - 0s 234us/sample - loss: 1.5033 - acc: 0.4355
    Epoch 52/100
    62/62 [==============================] - 0s 233us/sample - loss: 1.4698 - acc: 0.4355
    Epoch 53/100
    62/62 [==============================] - 0s 216us/sample - loss: 1.4486 - acc: 0.4355
    Epoch 54/100
    62/62 [==============================] - 0s 277us/sample - loss: 1.4537 - acc: 0.4355
    Epoch 55/100
    62/62 [==============================] - 0s 214us/sample - loss: 1.4533 - acc: 0.4355
    Epoch 56/100
    62/62 [==============================] - 0s 224us/sample - loss: 1.4438 - acc: 0.4355
    Epoch 57/100
    62/62 [==============================] - 0s 252us/sample - loss: 1.4295 - acc: 0.4355
    Epoch 58/100
    62/62 [==============================] - 0s 289us/sample - loss: 1.4214 - acc: 0.4355
    Epoch 59/100
    62/62 [==============================] - 0s 247us/sample - loss: 1.4170 - acc: 0.4355
    Epoch 60/100
    62/62 [==============================] - 0s 213us/sample - loss: 1.4095 - acc: 0.4355
    Epoch 61/100
    62/62 [==============================] - 0s 302us/sample - loss: 1.3916 - acc: 0.4355
    Epoch 62/100
    62/62 [==============================] - 0s 232us/sample - loss: 1.3877 - acc: 0.4355
    Epoch 63/100
    62/62 [==============================] - 0s 228us/sample - loss: 1.3765 - acc: 0.4355
    Epoch 64/100
    62/62 [==============================] - 0s 299us/sample - loss: 1.3811 - acc: 0.4355
    Epoch 65/100
    62/62 [==============================] - 0s 256us/sample - loss: 1.3795 - acc: 0.4355
    Epoch 66/100
    62/62 [==============================] - 0s 291us/sample - loss: 1.3574 - acc: 0.4355
    Epoch 67/100
    62/62 [==============================] - 0s 254us/sample - loss: 1.3492 - acc: 0.4355
    Epoch 68/100
    62/62 [==============================] - 0s 209us/sample - loss: 1.3499 - acc: 0.4355
    Epoch 69/100
    62/62 [==============================] - 0s 209us/sample - loss: 1.3304 - acc: 0.4355
    Epoch 70/100
    62/62 [==============================] - 0s 215us/sample - loss: 1.3185 - acc: 0.4355
    Epoch 71/100
    62/62 [==============================] - 0s 228us/sample - loss: 1.3221 - acc: 0.4677
    Epoch 72/100
    62/62 [==============================] - 0s 261us/sample - loss: 1.3000 - acc: 0.4677
    Epoch 73/100
    62/62 [==============================] - 0s 209us/sample - loss: 1.2968 - acc: 0.4516
    Epoch 74/100
    62/62 [==============================] - 0s 225us/sample - loss: 1.3253 - acc: 0.4677
    Epoch 75/100
    62/62 [==============================] - 0s 234us/sample - loss: 1.2877 - acc: 0.4677
    Epoch 76/100
    62/62 [==============================] - 0s 202us/sample - loss: 1.2892 - acc: 0.4839
    Epoch 77/100
    62/62 [==============================] - 0s 221us/sample - loss: 1.2595 - acc: 0.4839
    Epoch 78/100
    62/62 [==============================] - 0s 200us/sample - loss: 1.2663 - acc: 0.4839
    Epoch 79/100
    62/62 [==============================] - 0s 221us/sample - loss: 1.2466 - acc: 0.5000
    Epoch 80/100
    62/62 [==============================] - 0s 216us/sample - loss: 1.2508 - acc: 0.4839
    Epoch 81/100
    62/62 [==============================] - 0s 189us/sample - loss: 1.2334 - acc: 0.4677
    Epoch 82/100
    62/62 [==============================] - 0s 223us/sample - loss: 1.2180 - acc: 0.4839
    Epoch 83/100
    62/62 [==============================] - 0s 227us/sample - loss: 1.2409 - acc: 0.4677
    Epoch 84/100
    62/62 [==============================] - 0s 218us/sample - loss: 1.2258 - acc: 0.4677
    Epoch 85/100
    62/62 [==============================] - 0s 249us/sample - loss: 1.1977 - acc: 0.4839
    Epoch 86/100
    62/62 [==============================] - 0s 210us/sample - loss: 1.2270 - acc: 0.4839
    Epoch 87/100
    62/62 [==============================] - 0s 217us/sample - loss: 1.2157 - acc: 0.5161
    Epoch 88/100
    62/62 [==============================] - 0s 216us/sample - loss: 1.1658 - acc: 0.5161
    Epoch 89/100
    62/62 [==============================] - 0s 202us/sample - loss: 1.1864 - acc: 0.4839
    Epoch 90/100
    62/62 [==============================] - 0s 205us/sample - loss: 1.1830 - acc: 0.5161
    Epoch 91/100
    62/62 [==============================] - 0s 216us/sample - loss: 1.1624 - acc: 0.5161
    Epoch 92/100
    62/62 [==============================] - 0s 192us/sample - loss: 1.1616 - acc: 0.5323
    Epoch 93/100
    62/62 [==============================] - 0s 212us/sample - loss: 1.1494 - acc: 0.5484
    Epoch 94/100
    62/62 [==============================] - 0s 189us/sample - loss: 1.1198 - acc: 0.5323
    Epoch 95/100
    62/62 [==============================] - 0s 210us/sample - loss: 1.1305 - acc: 0.5484
    Epoch 96/100
    62/62 [==============================] - 0s 222us/sample - loss: 1.1220 - acc: 0.5484
    Epoch 97/100
    62/62 [==============================] - 0s 234us/sample - loss: 1.0849 - acc: 0.5968
    Epoch 98/100
    62/62 [==============================] - 0s 198us/sample - loss: 1.1161 - acc: 0.5645
    Epoch 99/100
    62/62 [==============================] - 0s 195us/sample - loss: 1.1265 - acc: 0.5484
    Epoch 100/100
    62/62 [==============================] - 0s 216us/sample - loss: 1.1141 - acc: 0.5484
    2019-07-29 11:10:32 INFO     rasa.core.policies.keras_policy  - Done fitting keras policy model
    2019-07-29 11:10:33 INFO     rasa.core.agent  - Persisted model to '/var/folders/9y/xksbgbfx79sgdrf18b6vh6gw0000gn/T/tmplricej0h/core'
    Core model training completed.
    Training NLU model...
    2019-07-29 11:10:33 INFO     rasa.nlu.training_data.loading  - Training data format of /var/folders/9y/xksbgbfx79sgdrf18b6vh6gw0000gn/T/tmpegquajbg/7fd2d266caa343cfa3332f6de40b69dd_nlu.md is md
    2019-07-29 11:10:33 INFO     rasa.nlu.training_data.training_data  - Training data stats: 
        - intent examples: 39 (6 distinct intents)
        - Found intents: 'greet', 'mood_great', 'goodbye', 'deny', 'affirm', 'mood_unhappy'
        - entity examples: 0 (0 distinct entities)
        - found entities: 
    
    2019-07-29 11:10:33 INFO     rasa.nlu.model  - Starting to train component WhitespaceTokenizer
    2019-07-29 11:10:33 INFO     rasa.nlu.model  - Finished training component.
    2019-07-29 11:10:33 INFO     rasa.nlu.model  - Starting to train component RegexFeaturizer
    2019-07-29 11:10:33 INFO     rasa.nlu.model  - Finished training component.
    2019-07-29 11:10:33 INFO     rasa.nlu.model  - Starting to train component CRFEntityExtractor
    2019-07-29 11:10:33 INFO     rasa.nlu.model  - Finished training component.
    2019-07-29 11:10:33 INFO     rasa.nlu.model  - Starting to train component EntitySynonymMapper
    2019-07-29 11:10:33 INFO     rasa.nlu.model  - Finished training component.
    2019-07-29 11:10:33 INFO     rasa.nlu.model  - Starting to train component CountVectorsFeaturizer
    2019-07-29 11:10:33 INFO     rasa.nlu.model  - Finished training component.
    2019-07-29 11:10:33 INFO     rasa.nlu.model  - Starting to train component EmbeddingIntentClassifier
    2019-07-29 11:10:33 INFO     rasa.nlu.classifiers.embedding_intent_classifier  - Accuracy is updated every 10 epochs
    Epochs: 100%|████████████████████████████████████████████████████████████████████████| 300/300 [00:01<00:00, 276.65it/s, loss=0.091, acc=1.000]
    2019-07-29 11:10:34 INFO     rasa.nlu.classifiers.embedding_intent_classifier  - Finished training embedding classifier, loss=0.091, train accuracy=1.000
    2019-07-29 11:10:34 INFO     rasa.nlu.model  - Finished training component.
    2019-07-29 11:10:35 INFO     rasa.nlu.model  - Successfully saved model into '/var/folders/9y/xksbgbfx79sgdrf18b6vh6gw0000gn/T/tmplricej0h/nlu'
    NLU model training completed.
    Your Rasa model is trained and saved at '/Users/weipengfei/workspaces/RasaProjects/models/20190729-111027.tar.gz'.
    If you want to speak to the assistant, run 'rasa shell' at any time inside the project directory.
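
    The intents listed in the output above ('greet', 'goodbye', 'mood_great', ...) come from the sample NLU data and stories that rasa init generates. As a rough, abridged sketch of the Rasa 1.x Markdown formats involved (not the exact generated files):

    <!-- data/nlu.md : NLU training examples, grouped by intent -->
    ## intent:greet
    - hey
    - hello
    - hi there

    <!-- data/stories.md : Core training stories (user intent -> bot action) -->
    ## happy path
    * greet
      - utter_greet
    * mood_great
      - utter_happy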
    

    Run the shell to talk to the assistant:

    rasa shell
    
    # Result (screenshot of the shell session):
    ![rasa shell](https://img.haomeiwen.com/i4905462/3ce80ac380b34182.png?imageMogr2/auto-orient/strip%7CimageView2/2/w/1240)
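
    A couple of related Rasa 1.x CLI variants can be handy at this point (check rasa shell --help on your version for the exact flags):

    # chat with the assistant and log the predicted intents and actions
    rasa shell --debug

    # test only the NLU model: type a message, see the parsed intent and entities
    rasa shell nlu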
    
    

    3. Learning from Real Conversations with Rasa X

    # Install
    pip install rasa-x --extra-index-url https://pypi.rasa.com/simple
    # Run (from inside the project directory)
    cd RasaProjects && rasa x
    

    Result (in local mode, Rasa X opens a web UI in the browser, typically at http://localhost:5002):

    (Screenshot: Rasa X after startup)
