chatglm3-6b model deployment + calling it via Flask

Author: a十二_4765 | Published 2023-11-02 09:39

    Model download:

    git lfs install
    git clone https://www.modelscope.cn/ZhipuAI/chatglm3-6b.git

    After the download finishes, note the local directory the model was cloned into.
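
    Before wiring the model into Flask, it can help to verify the download with a quick standalone test. This is a minimal sketch; the path E:/yi/chatglm3-6b is only an example and should be replaced with the directory you cloned the model into.

    from transformers import AutoTokenizer, AutoModel

    # Load tokenizer and model from the local directory; trust_remote_code is
    # required because chatglm3-6b ships its own modeling code.
    tokenizer = AutoTokenizer.from_pretrained("E:/yi/chatglm3-6b", trust_remote_code=True)
    model = AutoModel.from_pretrained("E:/yi/chatglm3-6b", trust_remote_code=True).half().cuda()
    model = model.eval()

    # Run a single chat turn to confirm the weights load and generation works.
    response, history = model.chat(tokenizer, "你好", history=[])
    print(response)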

    from flask import Flask, jsonify, request
    from transformers import AutoTokenizer, AutoModel

    app = Flask(__name__)

    # Load chatglm3-6b from the local directory cloned above (adjust the path
    # to your own location). half() + cuda() runs the model in fp16 on the GPU.
    tokenizer = AutoTokenizer.from_pretrained("E:/yi/chatglm3-6b", trust_remote_code=True)
    model = AutoModel.from_pretrained("E:/yi/chatglm3-6b", trust_remote_code=True).half().cuda()
    model = model.eval()

    @app.route('/goodsApi', methods=['GET'])
    def transaction():
        item = {}

        # Read the two sentences to compare from the query string.
        title = request.args.get("title")
        title1 = request.args.get("title1")

        # Ask the model (in Chinese) how similar the two sentences are.
        response, history = model.chat(
            tokenizer, title + '和' + title1 + '  这两句话的相似度是多少', history=[])

        item['data'] = response
        return jsonify(item), 200


    if __name__ == '__main__':
        app.run(host='0.0.0.0', port=5000)


    Finally, since the app listens on port 5000, access it locally on port 5000:

    http://127.0.0.1:5000/goodsApi?title=hello&title1=hello123
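
    To call the endpoint from another script rather than the browser, a small client sketch is shown below. It assumes the requests package is installed and that the server above is running on the same machine; the two query parameters are just sample values.

    import requests

    # Query the similarity endpoint with two sample sentences.
    resp = requests.get(
        "http://127.0.0.1:5000/goodsApi",
        params={"title": "hello", "title1": "hello123"},
    )
    print(resp.status_code)      # 200 on success
    print(resp.json()["data"])   # the model's answer about the similarity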
