90905-centos7-flink1.9.0-hadoop2.7.3: Building Flink from Source

Author: _backtrack_ | Published 2019-09-25 10:52

Build environment

Type          Version            Notes
OS            CentOS             el7.x86_64
maven         3.6.1
node          v10.14.1
flink source  release-1.9.0
hadoop        2.7.3.2.6.0.3-8
ambari-hdp    2.6.0.3-8
scala         2.12
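Before starting, it is worth confirming that the toolchain on the build host matches this table. A minimal check, assuming a JDK 8 is used for building Flink 1.9:
java -version   # a JDK 8 is assumed here
mvn -v          # should report 3.6.1 once Maven is set up (see "Environment setup" below)
node -v         # should report v10.14.1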

Build process (for environment setup, see the section below)

  1. Get the flink-release-1.9.0 source
cd ~
# Download the 1.9.0 source archive from the link below (do not download the release zip package)
wget https://codeload.github.com/apache/flink/zip/release-1.9.0 -O flink-release-1.9.0.zip
unzip flink-release-1.9.0.zip
  2. Pre-process the source
# Enter the source directory
cd flink-release-1.9.0
# Remove the test-related modules from the root pom.xml
vim pom.xml
# The modules to remove:
<module>flink-tests</module>
<module>flink-test-utils-parent</module>
<module>flink-end-to-end-tests</module>
<module>flink-yarn-tests</module>
<module>flink-fs-tests</module>
<module>flink-docs</module>
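# The same removal can be scripted instead of editing pom.xml by hand; this is a
# sketch that assumes each <module> entry sits on a single line, as in the stock pom.xml:
for m in flink-tests flink-test-utils-parent flink-end-to-end-tests \
         flink-yarn-tests flink-fs-tests flink-docs; do
  sed -i "/<module>${m}<\/module>/d" pom.xml
done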
# Edit the pom files of the s3 and oss filesystem modules
vim flink-filesystems/flink-s3-fs-hadoop/pom.xml
vim flink-filesystems/flink-oss-fs-hadoop/pom.xml
# In the flink-fs-hadoop-shaded dependency, delete the following two lines:
  <scope>test</scope>
  <type>test-jar</type> 
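# An equivalent non-interactive edit for both modules; a sketch that assumes the
# dependency-block layout of the 1.9.0 poms, so review the result with git diff:
for d in flink-filesystems/flink-s3-fs-hadoop flink-filesystems/flink-oss-fs-hadoop; do
  sed -i '/<artifactId>flink-fs-hadoop-shaded<\/artifactId>/,/<\/dependency>/{/<scope>test<\/scope>/d;/<type>test-jar<\/type>/d}' "$d/pom.xml"
done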
# In flink-connectors/flink-hadoop-compatibility/pom.xml, add the following dependency:
<dependency>
    <groupId>commons-cli</groupId>
    <artifactId>commons-cli</artifactId>
    <version>1.4</version>
</dependency>
# In flink-connectors/pom.xml, add the following dependency:
<dependency>
    <groupId>commons-net</groupId>
    <artifactId>commons-net</artifactId>
    <version>3.6</version>
</dependency>   
# Node.js handling for the web dashboard
cd flink-runtime-web/web-dashboard
npm ci --cache-max=0 --no-save
rm -rf node_modules/caniuse-lite
rm -rf node_modules/browserslist
npm update
  3. Build
cd ~/flink-release-1.9.0
mvn clean install -Dmaven.test.skip=true  -Dcheckstyle.skip=true -Dlicense.skip=true -Drat.skip=true  -Drat.ignoreErrors=true  -Dfast -Dscala-2.12 -Dhadoop.version=2.7.3
# If the build fails, Maven prints a hint such as mvn <goals> -rf :flink-hadoop-fs (...); resume the build from that module like this:
mvn clean install -Dmaven.test.skip=true  -Dcheckstyle.skip=true -Dlicense.skip=true -Drat.skip=true  -Drat.ignoreErrors=true  -Dfast -Dscala-2.12 -Dhadoop.version=2.7.3 -rf :flink-hadoop-fs
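# If only the binary distribution is needed, the build can also be narrowed to
# flink-dist plus the modules it depends on using standard Maven options (-pl/-am);
# this is an optional shortcut, not part of the original run:
mvn clean install -pl flink-dist -am -Dmaven.test.skip=true -Dcheckstyle.skip=true -Dlicense.skip=true -Drat.skip=true -Drat.ignoreErrors=true -Dfast -Dscala-2.12 -Dhadoop.version=2.7.3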
# A successful build ends with output like the following:
[INFO] flink-metrics-graphite ............................. SUCCESS [  0.239 s]
[INFO] flink-metrics-influxdb ............................. SUCCESS [  1.130 s]
[INFO] flink-metrics-prometheus ........................... SUCCESS [  0.666 s]
[INFO] flink-metrics-statsd ............................... SUCCESS [  0.264 s]
[INFO] flink-metrics-datadog .............................. SUCCESS [  0.420 s]
[INFO] flink-metrics-slf4j ................................ SUCCESS [  0.247 s]
[INFO] flink-python ....................................... SUCCESS [  0.969 s]
[INFO] flink-dist ......................................... SUCCESS [ 19.897 s]
[INFO] flink-yarn-tests ................................... SUCCESS [  1.174 s]
[INFO] flink-ml-parent .................................... SUCCESS [  0.068 s]
[INFO] flink-ml-api ....................................... SUCCESS [  0.413 s]
[INFO] flink-ml-lib ....................................... SUCCESS [  0.390 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  11:25 min
[INFO] Finished at: 2019-09-05T20:28:00+08:00
[INFO] ------------------------------------------------------------------------
  4. Problems encountered and solutions
# Use mvn to manually install the downloaded jars (flink-shaded-hadoop-2-2.7.5-7.0.jar and so on) into the local repository
# Note: do not run these install commands inside the Flink source tree
  • 4.1 Could not find artifact org.apache.flink:flink-shaded-hadoop-2:jar:2.7.3-7.0
(failing module: flink-hadoop-fs)
# Search Maven Central for flink-shaded-hadoop-2; the exact version is not published, so download the closest available version
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-shaded-hadoop-2</artifactId>
    <version>2.7.5-7.0</version>
    <scope>provided</scope>
</dependency>
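# The jar can be fetched from Maven Central before installing it; the URL below
# follows the standard repository layout (verify that it resolves in your environment):
wget https://repo1.maven.org/maven2/org/apache/flink/flink-shaded-hadoop-2/2.7.5-7.0/flink-shaded-hadoop-2-2.7.5-7.0.jar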
mvn install:install-file -DgroupId=org.apache.flink -DartifactId=flink-shaded-hadoop-2 -Dversion=2.7.3-7.0 -Dpackaging=jar  -Dfile=./flink-shaded-hadoop-2-2.7.5-7.0.jar
  • 4.2 Could not find artifact org.apache.flink:flink-fs-hadoop-shaded:jar:tests:1.9.0
    (failing module: flink-s3-fs-hadoop)
# Full error message:
 Failed to execute goal on project flink-s3-fs-hadoop: Could not resolve dependencies for project org.apache.flink:flink-s3-fs-hadoop:jar:1.9.0: Failure to find org.apache.flink:flink-fs-hadoop-shaded:jar:tests:1.9.0 in http://maven.aliyun.com/nexus/content/groups/public was cached in the local repository, resolution will not be reattempted until the update interval of alimaven has elapsed or updates are forced
# Fix: edit the pom files (same change as in step 2 above)
vim flink-filesystems/flink-s3-fs-hadoop/pom.xml
vim flink-filesystems/flink-oss-fs-hadoop/pom.xml
# In the flink-fs-hadoop-shaded dependency, delete the following two lines:
  <scope>test</scope>
  <type>test-jar</type>                 
  • 4.3 Could not find artifact org.apache.flink:flink-shaded-hadoop-2-uber:jar:2.7.3-7.0
 (failing module: flink-yarn-tests)
 # Search Maven Central for flink-shaded-hadoop-2-uber; the exact version is not published, so download the closest available version
 <dependency>
     <groupId>org.apache.flink</groupId>
     <artifactId>flink-shaded-hadoop-2-uber</artifactId>
     <version>2.7.5-7.0</version>
     <scope>provided</scope>
 </dependency>
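 # As above, the uber jar can be fetched from Maven Central first; the URL follows
 # the standard repository layout (verify that it resolves in your environment):
 wget https://repo1.maven.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.7.5-7.0/flink-shaded-hadoop-2-uber-2.7.5-7.0.jar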
 mvn install:install-file -DgroupId=org.apache.flink -DartifactId=flink-shaded-hadoop-2-uber -Dversion=2.7.3-7.0 -Dpackaging=jar  -Dfile=./flink-shaded-hadoop-2-uber-2.7.5-7.0.jar

  5. After the build completes, the binary distribution can be found at:

cd ~/flink-release-1.9.0
cd flink-dist/target/flink-1.9.0-bin/flink-1.9.0
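As a quick smoke test of the freshly built distribution, the standalone scripts shipped in flink-dist can be used; the web dashboard should be reachable on port 8081 by default:
./bin/start-cluster.sh    # start a local standalone cluster
# open http://<host>:8081 to check the web dashboard
./bin/stop-cluster.sh     # stop the local cluster again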

Environment setup

maven

  1. Download the package
# Go to your download directory
cd ~
# Download the package
wget http://mirror.olnevhost.net/pub/apache/maven/maven-3/3.6.1/binaries/apache-maven-3.6.1-bin.zip
  2. Configure the environment
# Unpack the archive
unzip apache-maven-3.6.1-bin.zip
# Rename the directory
mv apache-maven-3.6.1 maven
# Configure environment variables
sudo vim /etc/profile
# Add the following lines
export MAVEN_HOME=~/maven
export PATH=$MAVEN_HOME/bin:$PATH
# Reload the environment variables
source /etc/profile
  3. Configure the Aliyun Maven mirror
 # Edit the settings file
vim ~/maven/conf/settings.xml
# Add the following entry inside the <mirrors> node
    <mirror>
      <id>alimaven</id>
      <name>aliyun maven</name>
      <url>http://maven.aliyun.com/nexus/content/groups/public</url>
      <mirrorOf>central</mirrorOf>
    </mirror>
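To confirm the mirror is picked up, the effective settings can be printed with the standard Maven help plugin:
mvn help:effective-settings | grep -A 2 alimaven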

  4. Test Maven
# Run
mvn 
# Output like the following means Maven is set up correctly (the BUILD FAILURE is expected, since there is no pom.xml in the current directory)
[INFO] Scanning for projects...
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  0.098 s
[INFO] Finished at: 2019-09-05T11:48:27+08:00
[INFO] ------------------------------------------------------------------------

node environment setup

  1. Install the environment
cd ~
wget https://npm.taobao.org/mirrors/node/v10.14.1/node-v10.14.1-linux-x64.tar.gz
tar zxvf node-v10.14.1-linux-x64.tar.gz
mv node-v10.14.1-linux-x64 node
sudo ln -s ~/node/bin/node /usr/local/bin/node
sudo ln -s ~/node/bin/npm /usr/local/bin/npm
# Install cnpm from the taobao registry (optional)
npm install -g cnpm --registry=https://registry.npm.taobao.org
# Or simply define cnpm as an alias for npm pointed at the taobao registry
alias cnpm="npm --registry=https://registry.npm.taobao.org \
--cache=$HOME/.npm/.cache/cnpm \
--disturl=https://npm.taobao.org/dist \
--userconfig=$HOME/.cnpmrc"
# Work around npm permission issues
npm config -g set unsafe-perm
  2. Test
# Run
npm 
# Output like the following means the environment is working
Usage: npm <command>
where <command> is one of:
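A final version check confirms that the binaries on PATH match the build environment table:
node -v   # expected: v10.14.1
npm -v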
