1. BERT
bert-as-service is a tool that quickly turns text into vector representations: it wraps a pretrained BERT model as a service, and a simple client call returns the embeddings we want.
2. Installing bert-as-service
1) Requirements:
Python >= 3.5, TensorFlow >= 1.10
Local setup: Windows 10, Python 3.7, TensorFlow 1.13.1
Server setup: CentOS, Python 3.6, TensorFlow 1.13.1
2) Install the server and the client
Run:
pip install -U bert-serving-server bert-serving-client
(Or, via the Tsinghua mirror: pip install -i https://pypi.tuna.tsinghua.edu.cn/simple -U bert-serving-server bert-serving-client)
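A quick way to confirm the environment meets the requirements above is a small version check. This is an illustrative helper (check_environment is my own name, not part of bert-as-service):

```python
import sys

def check_environment(py_min=(3, 5), tf_min=(1, 10)):
    """Return a list of problems with the current environment (empty list = OK)."""
    problems = []
    if sys.version_info < py_min:
        problems.append("Python >= %d.%d is required" % py_min)
    try:
        import tensorflow as tf
        # Compare only the major.minor part of a version string like "1.13.1"
        tf_version = tuple(int(x) for x in tf.__version__.split(".")[:2])
        if tf_version < tf_min:
            problems.append("TensorFlow >= %d.%d is required" % tf_min)
    except ImportError:
        problems.append("TensorFlow is not installed")
    return problems

print(check_environment())  # an empty list means the requirements are satisfied
```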
3. Starting the BERT service
1) Download a pretrained model
GitHub: https://github.com/google-research/bert/
- BERT-Large, Uncased (Whole Word Masking): 24-layer, 1024-hidden, 16-heads, 340M parameters
- BERT-Large, Cased (Whole Word Masking): 24-layer, 1024-hidden, 16-heads, 340M parameters
- BERT-Base, Uncased: 12-layer, 768-hidden, 12-heads, 110M parameters
- BERT-Large, Uncased: 24-layer, 1024-hidden, 16-heads, 340M parameters
- BERT-Base, Cased: 12-layer, 768-hidden, 12-heads, 110M parameters
- BERT-Large, Cased: 24-layer, 1024-hidden, 16-heads, 340M parameters
- BERT-Base, Multilingual Cased (New, recommended): 104 languages, 12-layer, 768-hidden, 12-heads, 110M parameters
- BERT-Base, Multilingual Uncased (Orig, not recommended; use Multilingual Cased instead): 102 languages, 12-layer, 768-hidden, 12-heads, 110M parameters
- BERT-Base, Chinese: Chinese Simplified and Traditional, 12-layer, 768-hidden, 12-heads, 110M parameters
Download the BERT-Base, Chinese model and unzip it under the project root directory.
2) Start the service
After unzipping, run the command below to start the server, replacing the directory with your own unzipped path (-num_worker sets the number of worker processes, i.e. how many requests can be served concurrently):
bert-serving-start -model_dir /Users/mantch/Downloads/chinese_L-12_H-768_A-12 -num_worker=4
Once running, you should see output like the following:
ckpt_name = bert_model.ckpt
config_name = bert_config.json
cors = *
cpu = False
device_map = []
do_lower_case = True
fixed_embed_length = False
fp16 = False
gpu_memory_fraction = 0.5
graph_tmp_dir = None
http_max_connect = 10
http_port = None
mask_cls_sep = False
max_batch_size = 256
max_seq_len = 25
model_dir = ./chinese_L-12_H-768_A-12
no_position_embeddings = False
no_special_token = False
num_worker = 4
pooling_layer = [-2]
pooling_strategy = REDUCE_MEAN
port = 5555
port_out = 5556
prefetch_size = 10
priority_batch_size = 16
show_tokens_to_client = False
tuned_model_dir = None
verbose = False
xla = False
I:WORKER-1:[__i:gen:559]:ready and listening!
I:WORKER-0:[__i:gen:559]:ready and listening!
I:WORKER-2:[__i:gen:559]:ready and listening!
I:WORKER-3:[__i:gen:559]:ready and listening!
I:VENTILATOR:[__i:_ru:164]:all set, ready to serve request!
Here port = 5555 and port_out = 5556 are the server's port numbers; the "ready and listening" lines show the service has started successfully.
3) Call BERT from your own application
from bert_serving.client import BertClient

# Connect to the local server; skip the version/length handshake checks
bc = BertClient(ip='localhost', check_version=False, check_length=False)
vec = bc.encode(['学习'])  # returns a numpy array, one row per input sentence
print(vec)
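For BERT-Base each sentence comes back as a 768-dimensional vector, so downstream tasks such as semantic similarity reduce to vector math. A minimal sketch using cosine similarity — the toy vectors below stand in for real encodings so the snippet runs without a BERT server; in practice you would compare rows of bc.encode([...]):

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two 1-D vectors."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# In practice: vecs = bc.encode(['学习', '学问']); cosine_similarity(vecs[0], vecs[1])
v1 = np.array([0.2, 0.1, 0.7])
v2 = np.array([0.2, 0.1, 0.7])
v3 = np.array([-0.7, 0.1, 0.2])
print(cosine_similarity(v1, v2))  # identical vectors -> 1.0
print(cosine_similarity(v1, v3))  # dissimilar vectors score lower
```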
Notes:
The Linux server was initially set up with the same Python version as the local machine (3.7), which produced various errors:
错误1:ImportError: /lib64/libstdc++.so.6: version `CXXABI_1.3.8' not found (required by /root/anaconda3/en
错误2:ImportError: /lib64/libm.so.6: version `GLIBC_2.23' not found (required by /root/anaconda3/envs/tensorflow11/lib/python3.7/site-packages/tensorflow/python/_pywrap_tensorflow_internal.so)
Switching Python to 3.6 and reinstalling TensorFlow resolved both errors, and the service started successfully.