Azkaban Application Examples
1. Command type: single-job example
Create a project
![](https://img.haomeiwen.com/i14270006/b6e9b8cdc667f138.png)
![](https://img.haomeiwen.com/i14270006/b2b07063dc6200b4.png)
After the project is created:
![](https://img.haomeiwen.com/i14270006/b33d199afb4ed8c3.png)
Click Upload. Note that the uploaded file must be a zip archive.
Contents of command.job:
#command.job
type=command
command=echo 'hello azkaban'
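For reference, a minimal way to package this job on Linux, assuming the file is saved as command.job in the current directory (the archive name is arbitrary):
# package the single job file into a zip for upload
zip command.zip command.job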
![](https://img.haomeiwen.com/i14270006/9197ecaf1f8367db.png)
![](https://img.haomeiwen.com/i14270006/edd89ad880bd9e3d.png)
Click Execute Flow.
![](https://img.haomeiwen.com/i14270006/c32b657183b1afb7.png)
Click Execute to run the flow.
![](https://img.haomeiwen.com/i14270006/0ecbb96c8cff0696.png)
![](https://img.haomeiwen.com/i14270006/4447e802c1fe4df6.png)
![](https://img.haomeiwen.com/i14270006/ac3278ef3c78518e.png)
Click Details in the screenshot above to view the printed output:
![](https://img.haomeiwen.com/i14270006/307c2fc7e0a782b4.png)
The uploaded project can be found under the projects directory configured earlier:
cd /home/bigdata/apps/azkaban-2.5.0/projects/1.1/
ll
![](https://img.haomeiwen.com/i14270006/7410c10aa6540a64.png)
2. Command type: multi-job workflow (flow)
Create a project
![](https://img.haomeiwen.com/i14270006/12bd97755d403b6c.png)
Upload the job files that depend on each other.
stepone.job
# stepone.job
type=command
command=echo stepone
steptwo.job
# steptwo.job
type=command
dependencies=stepone
command=echo steptwo
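Both job files must go into a single zip. Azkaban builds the flow from the dependencies property, so stepone runs first and steptwo runs after it completes, and the flow appears under the name of the final job (steptwo). A minimal packaging sketch, assuming both files are in the current directory:
# package both jobs into one archive; the flow is derived from dependencies=stepone
zip flow.zip stepone.job steptwo.job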
![](https://img.haomeiwen.com/i14270006/11e4357aef42dcff.png)
![](https://img.haomeiwen.com/i14270006/eed3ef439c03f588.png)
Click Execute Flow.
![](https://img.haomeiwen.com/i14270006/499b1fc6b79ba20e.png)
Click Execute.
![](https://img.haomeiwen.com/i14270006/7df0b7a77065e7ed.png)
Execution result:
![](https://img.haomeiwen.com/i14270006/22974b7d5b3a9bf1.png)
Click Details to view each job's output.
![](https://img.haomeiwen.com/i14270006/5f8790148c649979.png)
3. Running an HDFS task
Create a project
![](https://img.haomeiwen.com/i14270006/9c48ac766583cee4.png)
Upload the zip archive.
hdfs.job
# hdfs.job
type=command
command=hadoop fs -mkdir -p /hello/azkaban
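The command runs as a shell command on the Azkaban executor host, so the hadoop client must be on that machine's PATH and able to reach the cluster. A packaging sketch (hdfs.zip is an assumed archive name):
# package the HDFS job; the hadoop client is invoked on the executor host
zip hdfs.zip hdfs.job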
![](https://img.haomeiwen.com/i14270006/4a3790d9b6297266.png)
![](https://img.haomeiwen.com/i14270006/b5f447e52e5dbd4d.png)
Execution succeeded:
![](https://img.haomeiwen.com/i14270006/b29d74389245d0f9.png)
![](https://img.haomeiwen.com/i14270006/d2960236eebd6a3c.png)
View the result:
![](https://img.haomeiwen.com/i14270006/659c25582171778f.png)
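The same result can also be verified from the command line; a quick manual check of the directory the job created:
# confirm the directory created by the job exists
hadoop fs -ls /hello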
4. Running MapReduce tasks
Create a project
![](https://img.haomeiwen.com/i14270006/f0f040978f9cc25e.png)
Upload the files.
mapreduce_pi.job
# mapreduce_pi.job
type=command
command=hadoop jar hadoop-mapreduce-examples-2.7.7.jar pi 5 1000
mapreduce_wordcount.job
# mapreduce_wordcount.job
type=command
dependencies=mapreduce_pi
command=hadoop jar hadoop-mapreduce-examples-2.7.7.jar wordcount /wordcount.txt /wordcount/output_azkaban
hadoop-mapreduce-examples-2.7.7.jar
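Both commands reference hadoop-mapreduce-examples-2.7.7.jar by a relative path, so the jar has to be packaged into the same zip as the two .job files; Azkaban runs each command from the unpacked project directory. The wordcount job also assumes /wordcount.txt already exists on HDFS. A packaging sketch (mapreduce.zip is an assumed archive name):
# the jar must sit next to the .job files inside the archive
zip mapreduce.zip mapreduce_pi.job mapreduce_wordcount.job hadoop-mapreduce-examples-2.7.7.jar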
![](https://img.haomeiwen.com/i14270006/fa869aa83518b1c4.png)
![](https://img.haomeiwen.com/i14270006/1c26228d1ba809a9.png)
Running:
![](https://img.haomeiwen.com/i14270006/ea853a02228c1bb5.png)
![](https://img.haomeiwen.com/i14270006/82e51a01fc437722.png)
Finished:
![](https://img.haomeiwen.com/i14270006/419ac9e069982fff.png)
Note that running and finished jobs are shown in different colors.
![](https://img.haomeiwen.com/i14270006/019e0bb903d69a6c.png)
Check the results in HDFS:
![](https://img.haomeiwen.com/i14270006/9a4583e40bd7e357.png)
![](https://img.haomeiwen.com/i14270006/2af58ca5d24aa95d.png)
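An equivalent check from the command line (the part file name assumes the default reducer output naming):
# list and print the wordcount output
hadoop fs -ls /wordcount/output_azkaban
hadoop fs -cat /wordcount/output_azkaban/part-r-00000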