Table of Contents
[TOC]
1 Case Study
Suppose the file to import is phone_area.csv, with the following format:
```
"1","1300000","山东","济南","中国联通","531","250000"
"2","1300001","江苏","常州","中国联通","519","213000"
"3","1300002","安徽","巢湖","中国联通","565","238000"
```
Define the configuration file phone_area_imp.conf:
```
input {
  file {
    path => ["/opt/test/phone_area.csv"]
    start_position => "beginning"
  }
}

filter {
  csv {
    separator => ","
    columns => ["id","phone_first","phone_provence","phone_city","service","area_number","code"]
  }
  mutate {
    convert => {
      "id" => "string"
      "phone_first" => "integer"
      "phone_provence" => "string"
      "phone_city" => "string"
      "service" => "string"
      "area_number" => "integer"
      "code" => "integer"
    }
  }
}

output {
  elasticsearch {
    hosts => ["172.16.110.41:9200"]
    index => "phone_area"
    document_type => "o"
  }
}
```
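To illustrate what the csv filter and the mutate convert step produce, here is a hypothetical Python sketch (not Logstash's actual implementation) that parses one sample line into the typed document that would be sent to Elasticsearch. The field names come from the config above; in the config, "string" conversions are no-ops since parsed CSV values are already strings, so only the integer conversions are modeled here.

```python
import csv
import io

# Column names from the csv filter's "columns" setting above.
COLUMNS = ["id", "phone_first", "phone_provence", "phone_city",
           "service", "area_number", "code"]
# Fields the mutate filter converts to integers; the rest stay strings.
INT_FIELDS = {"phone_first", "area_number", "code"}

def parse_row(line: str) -> dict:
    """Parse one quoted CSV line into a typed event dict."""
    values = next(csv.reader(io.StringIO(line)))
    event = dict(zip(COLUMNS, values))
    for field in INT_FIELDS:
        event[field] = int(event[field])
    return event

event = parse_row('"1","1300000","山东","济南","中国联通","531","250000"')
print(event["phone_first"], event["code"])  # 1300000 250000
```

Without the convert step, every field would be indexed as a string, which breaks range queries and numeric aggregations on fields like code.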
Run the import command:
```
/opt/logstash-6.4.0/bin/logstash -f phone_area_imp.conf
```
Once started, Logstash stays in the foreground on the command line and does not exit:
```
[2020-04-13T12:47:38,707][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x477b6a0 run>"}
[2020-04-13T12:47:39,002][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2020-04-13T12:47:39,060][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
[2020-04-13T12:47:40,733][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
```
Because the pipeline keeps running, any new lines appended to the csv file are still synced to Elasticsearch.
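Note that the file input records its read position in a sincedb file, so restarting Logstash does not re-import lines that have already been processed (start_position => "beginning" only applies to files Logstash has not seen before). If you need to re-import the whole file while testing, a common sketch is to disable position tracking by pointing sincedb at /dev/null:

```
input {
  file {
    path => ["/opt/test/phone_area.csv"]
    start_position => "beginning"
    # Do not persist the read position; the file is re-read on every start.
    sincedb_path => "/dev/null"
  }
}
```

Keep in mind that re-reading the file will re-send every row to Elasticsearch, creating duplicate documents unless the output is configured with a stable document_id.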