94.1 Demo Environment
- Kerberos is enabled on the cluster
- CM and CDH version: 5.13.1
94.2 Walkthrough
1. Environment preparation
The test script ooziejob.sh:
#!/bin/bash
name=$1
echo "hello $name" >> /tmp/oozieshell.log
Upload ooziejob.sh to the /user/fayson/oozie/shellaction/lib directory on HDFS:
kinit fayson
klist
hadoop fs -mkdir -p /user/fayson/oozie/shellaction/lib
hadoop fs -put ooziejob.sh /user/fayson/oozie/shellaction/lib
hadoop fs -ls /user/fayson/oozie/shellaction/lib
- The workflow.xml file for the Shell Action:
- The parameters used in workflow.xml are dynamic; their values are supplied later in the Java code.
<workflow-app name="ShellWorkflow" xmlns="uri:oozie:workflow:0.5">
    <start to="shell-d9b6"/>
    <kill name="Kill">
        <message>Action failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <action name="shell-d9b6">
        <shell xmlns="uri:oozie:shell-action:0.1">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <exec>${exec}</exec>
            <argument>${argument}</argument>
            <capture-output/>
        </shell>
        <ok to="End"/>
        <error to="Kill"/>
    </action>
    <end name="End"/>
</workflow-app>
Upload workflow.xml to the /user/fayson/oozie/shellaction directory on HDFS:
[root@ip-186-31-16-68 ~]# klist
[root@ip-186-31-16-68 ~]# hadoop fs -mkdir -p /user/fayson/oozie/shellaction
[root@ip-186-31-16-68 ~]# hadoop fs -put workflow.xml /user/fayson/oozie/shellaction
[root@ip-186-31-16-68 ~]# hadoop fs -ls /user/fayson/oozie/shellaction
Prepare the JAAS file oozie-login.conf with the following content:
com.sun.security.jgss.initiate {
    com.sun.security.auth.module.Krb5LoginModule required
    storeKey=true
    useKeyTab=true
    debug=true
    keyTab="/Volumes/Transcend/keytab/fayson.keytab"
    principal="fayson@FAYSON.COM";
};
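Optionally, before wiring this JAAS file into the Oozie client, you can first verify that the keytab login itself works. The following is a minimal, illustrative sketch (not part of the original walkthrough) that logs in against the com.sun.security.jgss.initiate entry defined above; it reuses the same local krb5.conf and keytab paths assumed later in ShellWorkflowDemo.java, and the class name JaasLoginCheck is made up for this example.
package com.cloudera.kerberos;

import javax.security.auth.login.LoginContext;

/**
 * Illustrative sketch: standalone check that the JAAS/keytab login works.
 * Uses the "com.sun.security.jgss.initiate" entry from oozie-login.conf.
 */
public class JaasLoginCheck {
    public static void main(String[] args) throws Exception {
        // Same local paths as used in ShellWorkflowDemo.java; adjust to your environment
        System.setProperty("java.security.krb5.conf", "/Volumes/Transcend/keytab/krb5.conf");
        System.setProperty("java.security.auth.login.config", "/Volumes/Transcend/keytab/oozie-login.conf");

        // Log in with the Krb5LoginModule configuration defined in oozie-login.conf
        LoginContext lc = new LoginContext("com.sun.security.jgss.initiate");
        lc.login();
        System.out.println("Kerberos login OK, principals: " + lc.getSubject().getPrincipals());
        lc.logout();
    }
}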
2. Create the Maven sample project
The content of the pom.xml file is as follows:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <parent>
        <artifactId>cdh-project</artifactId>
        <groupId>com.cloudera</groupId>
        <version>1.0-SNAPSHOT</version>
    </parent>
    <modelVersion>4.0.0</modelVersion>
    <artifactId>oozie-demo</artifactId>
    <packaging>jar</packaging>
    <name>oozie-demo</name>
    <url>http://maven.apache.org</url>
    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    </properties>
    <dependencies>
        <dependency>
            <groupId>org.apache.httpcomponents</groupId>
            <artifactId>httpclient</artifactId>
            <version>4.5.4</version>
        </dependency>
        <dependency>
            <groupId>net.sourceforge.spnego</groupId>
            <artifactId>spnego</artifactId>
            <version>7.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.oozie</groupId>
            <artifactId>oozie-client</artifactId>
            <version>4.1.0</version>
        </dependency>
    </dependencies>
</project>
3. Write the Oozie sample code
Create ShellWorkflowDemo.java with the following sample code:
package com.cloudera.kerberos;

import org.apache.oozie.client.AuthOozieClient;
import org.apache.oozie.client.WorkflowAction;
import org.apache.oozie.client.WorkflowJob;

import java.util.List;
import java.util.Properties;

/**
 * package: com.cloudera.kerberos
 * describe: Submit a Shell Action job to a Kerberos-enabled cluster using the Oozie client API
 * create_user: Fayson
 * create_date: 2018/3/15
 * create_time: 11:10 PM
 * WeChat official account: 碧茂大数据
 */
public class ShellWorkflowDemo {
    private static String oozieURL = "http://ip-186-31-16-68.ap-southeast-1.compute.internal:11000/oozie";

    public static void main(String[] args) {
        System.setProperty("java.security.krb5.conf", "/Volumes/Transcend/keytab/krb5.conf");
        System.setProperty("javax.security.auth.useSubjectCredsOnly", "false");
        System.setProperty("sun.security.jgss.debug", "true"); // enable Kerberos debug output
        System.setProperty("java.security.auth.login.config", "/Volumes/Transcend/keytab/oozie-login.conf");
        AuthOozieClient oozieClient = new AuthOozieClient(oozieURL, AuthOozieClient.AuthType.KERBEROS.name());
        oozieClient.setDebugMode(1);
        try {
            Properties properties = oozieClient.createConfiguration();
            properties.put("oozie.wf.application.path", "${nameNode}/user/fayson/oozie/shellaction");
            properties.put("oozie.use.system.libpath", "True");
            properties.put("nameNode", "hdfs://nameservice1");
            properties.put("jobTracker", "ip-186-31-16-68.ap-southeast-1.compute.internal:8032");
            properties.put("exec", "lib/ooziejob.sh");
            properties.put("argument", "fayson");

            // run the workflow
            String jobid = oozieClient.run(properties);
            System.out.println(jobid);

            // wait 10 seconds for the job to start before querying it
            Thread.sleep(10000L);

            // get the job status by workflow id
            WorkflowJob workflowJob = oozieClient.getJobInfo(jobid);
            // fetch the job log
            System.out.println(oozieClient.getJobLog(jobid));
            // get all actions of the workflow
            List<WorkflowAction> list = workflowJob.getActions();
            for (WorkflowAction action : list) {
                // print each action's external id, i.e. the YARN application ID
                System.out.println(action.getExternalId());
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
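Note that the fixed 10-second wait above may return before the workflow has actually finished, so the status and action list can still reflect a RUNNING job. A hedged alternative, sketched below on the assumption that it replaces the fixed wait inside the try block of ShellWorkflowDemo.main and reuses the oozieClient and jobid variables from the code above, is to poll the job until Oozie reports a terminal state:
// Illustrative sketch: poll until the workflow reaches a terminal state instead of sleeping a fixed time
// (assumes the oozieClient and jobid variables from ShellWorkflowDemo.java, inside the existing try block)
WorkflowJob job = oozieClient.getJobInfo(jobid);
while (job.getStatus() == WorkflowJob.Status.PREP || job.getStatus() == WorkflowJob.Status.RUNNING) {
    System.out.println("Workflow " + jobid + " is " + job.getStatus() + ", waiting...");
    Thread.sleep(5000L);
    job = oozieClient.getJobInfo(jobid);
}
System.out.println("Workflow finished with status: " + job.getStatus());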
4. Summary
- To submit a job through the Oozie API, the workflow.xml file must be defined in advance.
- Parameters are passed by calling oozieClient.createConfiguration() in the code to create a Properties object, putting the key/value pairs into it, and passing it to oozieClient.run(properties).
- When specifying the shell script, note that it must be placed in a lib directory at the same level as workflow.xml, and the code must not use the full HDFS path; a relative path is enough, e.g. properties.put("exec", "lib/ooziejob.sh").