Configuring the Spark History Server Locally on Ubuntu 16.04

Author: WJXZ | Published 2018-09-07 13:23
1. Install Spark (the rest of this post assumes Spark 2.2.2 unpacked under /usr/share/spark/spark-2.2.2-bin-hadoop2.7)
2. Configure spark-defaults.conf
cd /usr/share/spark/spark-2.2.2-bin-hadoop2.7/conf
sudo cp spark-defaults.conf.template  spark-defaults.conf
sudo vim spark-defaults.conf
# append at the end of the file
spark.eventLog.enabled           true
spark.eventLog.dir               file:/usr/share/spark/spark-2.2.2-bin-hadoop2.7/spark-logs
spark.history.fs.logDirectory    file:/usr/share/spark/spark-2.2.2-bin-hadoop2.7/spark-logs
# save and quit with :wq
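One related knob worth knowing: the history server UI listens on port 18080 by default. If that port is taken, it can be moved with `spark.history.ui.port`; the line below shows the default value, purely as an illustration:

```
spark.history.ui.port            18080
```

Also note that Spark will not create the event-log directory for you, which is why the next step creates it by hand.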
3. Create the spark-logs log directory
cd /usr/share/spark/spark-2.2.2-bin-hadoop2.7
sudo mkdir spark-logs
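Because the directory was just created with sudo, it is owned by root, while pyspark typically runs as your login user and must be able to write event logs into it. A minimal fix, assuming your normal login user is the one who will run Spark:

```shell
# spark-logs was created via sudo and is owned by root; hand it to the
# user who will actually run Spark so event logs can be written there:
sudo chown -R "$USER":"$USER" /usr/share/spark/spark-2.2.2-bin-hadoop2.7/spark-logs
```

Without this, applications started as a non-root user fail to write their event logs into the directory.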
4. Test
# run the PySpark shell
pyspark

Visit http://localhost:4040. If the Spark UI appears, the configuration is working; note that this port serves the UI of the currently running application. If several SparkContexts are running at once, the ports increment upward from 4040 (4041, 4042, and so on).
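Port 4040 only shows applications while they are running. To browse the logs of finished applications, which is what the history server is for, the history server itself still has to be started. A sketch, assuming the same install prefix used throughout this post:

```shell
# Start the history server; it reads spark.history.fs.logDirectory
# from conf/spark-defaults.conf:
cd /usr/share/spark/spark-2.2.2-bin-hadoop2.7
./sbin/start-history-server.sh
# The history server UI is at http://localhost:18080 by default;
# completed applications appear there after sc.stop() or shell exit.
```

Stop it later with ./sbin/stop-history-server.sh.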


Original link: https://www.haomeiwen.com/subject/dchcgftx.html