hadoop datanode startup error: Invalid HADOOP_COMMON_HOME

Author: codingbug | Published 2020-05-28 17:48

Starting the hadoop datanode from Ambari fails with ERROR: Invalid HADOOP_COMMON_HOME. The Ambari operation log shows:


resource_management.core.exceptions.ExecutionFailed: Execution of 'ambari-sudo.sh su ocdp -l -s /bin/bash -c 'ulimit -c unlimited ; /usr/hdp/3.1.5.1-2/hadoop/bin/hdfs --config /usr/hdp/3.1.5.1-2/hadoop/conf --daemon start datanode'' returned 1. ERROR: Cannot set priority of datanode process 12884
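The "Cannot set priority of datanode process" line is Hadoop's generic daemon-wrapper failure and says little by itself. Running the same command by hand as the service user should print the real message on stderr (a sketch, reusing the exact command from the log above):

# Run the command from the failed Ambari operation directly on the affected host
su ocdp -l -s /bin/bash -c \
  '/usr/hdp/3.1.5.1-2/hadoop/bin/hdfs --config /usr/hdp/3.1.5.1-2/hadoop/conf --daemon start datanode'
# On the broken host this prints the underlying error:
#   ERROR: Invalid HADOOP_COMMON_HOME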

Approaches that did not work:

  • Inspected the script /usr/hdp/3.1.5.1-2/hadoop/bin/hdfs, which does set some environment variables; adding the variable named in the error message to this script had no effect.

  • Checked the configuration directory /usr/hdp/3.1.5.1-2/hadoop/conf; the only suspicious file there was hadoop-env.sh, but nothing wrong turned up.

  • Set the environment variables manually on the host by editing .bash_profile (a sketch follows this list).

    This does work, but you have to hunt down every environment variable the stack needs, and it is easy to miss one.
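For reference, that workaround amounted to something like the following in the service user's ~/.bash_profile (a sketch; the values shown are the ones ultimately found in hadoop-layout.sh below):

# Manually exported Hadoop environment (partial workaround sketch)
export HADOOP_LIBEXEC_DIR=/usr/hdp/3.1.5.1-2/hadoop/libexec
export HADOOP_CONF_DIR=/usr/hdp/3.1.5.1-2/hadoop/conf
export HADOOP_COMMON_HOME=/usr/hdp/3.1.5.1-2/hadoop
export HADOOP_HDFS_HOME=/usr/hdp/3.1.5.1-2/hadoop-hdfs
export HADOOP_MAPRED_HOME=/usr/hdp/3.1.5.1-2/hadoop-mapreduce
export HADOOP_YARN_HOME=/usr/hdp/3.1.5.1-2/hadoop-yarn
# ...plus any other variables the stack scripts expect -- easy to miss one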

The fix that worked:

Searching the installation for the error string, to find the logic that raises it, shows that the message comes from /usr/hdp/3.1.5.1-2/hadoop/libexec/hadoop-functions.sh.
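A simple way to run that search (the path assumes this HDP version):

# Find which script emits the error string
grep -rn "Invalid HADOOP_COMMON_HOME" /usr/hdp/3.1.5.1-2/hadoop/libexec/
# expected hit: hadoop-functions.sh:  hadoop_error "ERROR: Invalid HADOOP_COMMON_HOME"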

The error comes from hadoop_basic_init (around line 829 of the file), which initializes the Hadoop shell environment variables:

## @description  Initialize the Hadoop shell environment, now that
## @description  user settings have been imported
## @audience     private
## @stability    evolving
## @replaceable  no
function hadoop_basic_init
{
  # Some of these are also set in hadoop-env.sh.
  # we still set them here just in case hadoop-env.sh is
  # broken in some way, set up defaults, etc.
  #
  # but it is important to note that if you update these
  # you also need to update hadoop-env.sh as well!!!

  CLASSPATH=""
  hadoop_debug "Initialize CLASSPATH"

  if [[ -z "${HADOOP_COMMON_HOME}" ]] &&
     [[ -d "${HADOOP_HOME}/${HADOOP_COMMON_DIR}" ]]; then
    export HADOOP_COMMON_HOME="${HADOOP_HOME}"
  fi

  # default policy file for service-level authorization
  HADOOP_POLICYFILE=${HADOOP_POLICYFILE:-"hadoop-policy.xml"}

  # define HADOOP_HDFS_HOME
  if [[ -z "${HADOOP_HDFS_HOME}" ]] &&
     [[ -d "${HADOOP_HOME}/${HDFS_DIR}" ]]; then
    export HADOOP_HDFS_HOME="${HADOOP_HOME}"
  fi

  # define HADOOP_YARN_HOME
  if [[ -z "${HADOOP_YARN_HOME}" ]] &&
     [[ -d "${HADOOP_HOME}/${YARN_DIR}" ]]; then
    export HADOOP_YARN_HOME="${HADOOP_HOME}"
  fi

  # define HADOOP_MAPRED_HOME
  if [[ -z "${HADOOP_MAPRED_HOME}" ]] &&
     [[ -d "${HADOOP_HOME}/${MAPRED_DIR}" ]]; then
    export HADOOP_MAPRED_HOME="${HADOOP_HOME}"
  fi

  if [[ ! -d "${HADOOP_COMMON_HOME}" ]]; then
    hadoop_error "ERROR: Invalid HADOOP_COMMON_HOME"
    exit 1
  fi

  if [[ ! -d "${HADOOP_HDFS_HOME}" ]]; then
    hadoop_error "ERROR: Invalid HADOOP_HDFS_HOME"
    exit 1
  fi
  ...
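This makes the failure mode visible: HADOOP_COMMON_HOME is only exported when ${HADOOP_HOME}/${HADOOP_COMMON_DIR} exists as a directory, so if HADOOP_COMMON_DIR holds a relative path that does not exist under HADOOP_HOME, the export is skipped and the later check aborts. A minimal sketch of that path (illustrative values; where HADOOP_COMMON_DIR gets its value is shown next):

# Sketch: how hadoop_basic_init can end up with an empty HADOOP_COMMON_HOME
HADOOP_HOME=/usr/hdp/3.1.5.1-2/hadoop
HADOOP_COMMON_DIR="share/hadoop/common"   # Apache default layout; absent under the HDP tree

if [[ -z "${HADOOP_COMMON_HOME}" ]] &&
   [[ -d "${HADOOP_HOME}/${HADOOP_COMMON_DIR}" ]]; then
  export HADOOP_COMMON_HOME="${HADOOP_HOME}"   # skipped: the directory test fails
fi

if [[ ! -d "${HADOOP_COMMON_HOME}" ]]; then
  echo "ERROR: Invalid HADOOP_COMMON_HOME"     # fires: the variable is still empty
fi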

These variables must be assigned somewhere, and the next stop is the function hadoop_bootstrap in the same file:

## @description  Bootstraps the Hadoop shell environment
## @audience     private
## @stability    evolving
## @replaceable  no
function hadoop_bootstrap
{
  # the root of the Hadoop installation
  # See HADOOP-6255 for the expected directory structure layout

  if [[ -n "${DEFAULT_LIBEXEC_DIR}" ]]; then
    hadoop_error "WARNING: DEFAULT_LIBEXEC_DIR ignored. It has been replaced by HADOOP_DEFAULT_LIBEXEC_DIR."
  fi

  # By now, HADOOP_LIBEXEC_DIR should have been defined upstream
  # We can piggyback off of that to figure out where the default
  # HADOOP_HOME should be.  This allows us to run without
  # HADOOP_HOME ever being defined by a human! As a consequence
  # HADOOP_LIBEXEC_DIR now becomes perhaps the single most powerful
  # env var within Hadoop.
  if [[ -z "${HADOOP_LIBEXEC_DIR}" ]]; then
    hadoop_error "HADOOP_LIBEXEC_DIR is not defined.  Exiting."
    exit 1
  fi
  HADOOP_DEFAULT_PREFIX=$(cd -P -- "${HADOOP_LIBEXEC_DIR}/.." >/dev/null && pwd -P)
  HADOOP_HOME=${HADOOP_HOME:-$HADOOP_DEFAULT_PREFIX}
  export HADOOP_HOME

  #
  # short-cuts. vendors may redefine these as well, preferably
  # in hadoop-layout.sh   ## <- note this file
  #
  HADOOP_COMMON_DIR=${HADOOP_COMMON_DIR:-"share/hadoop/common"}
  HADOOP_COMMON_LIB_JARS_DIR=${HADOOP_COMMON_LIB_JARS_DIR:-"share/hadoop/common/lib"}
  HADOOP_COMMON_LIB_NATIVE_DIR=${HADOOP_COMMON_LIB_NATIVE_DIR:-"lib/native"}
  HDFS_DIR=${HDFS_DIR:-"share/hadoop/hdfs"}
  HDFS_LIB_JARS_DIR=${HDFS_LIB_JARS_DIR:-"share/hadoop/hdfs/lib"}
    ...
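Every assignment here uses the ${VAR:-default} form, which keeps a pre-set value and only falls back to the default when the variable is unset or empty. So anything sourced before this point overrides the Apache defaults. A two-case illustration:

# Case 1: a layout file already set the variable -- the override survives
HADOOP_COMMON_DIR="./"
HADOOP_COMMON_DIR=${HADOOP_COMMON_DIR:-"share/hadoop/common"}
echo "${HADOOP_COMMON_DIR}"   # prints: ./

# Case 2: nothing set it -- the Apache default wins
unset HADOOP_COMMON_DIR
HADOOP_COMMON_DIR=${HADOOP_COMMON_DIR:-"share/hadoop/common"}
echo "${HADOOP_COMMON_DIR}"   # prints: share/hadoop/common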

This function mentions hadoop-layout.sh.

Checking /usr/hdp/3.1.5.1-2/hadoop/libexec on the broken host, that file was missing, while other (healthy) environments did have it.


Its contents are exactly the environment variable definitions we were missing:

vim hadoop-layout.sh

# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements.  See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License.  You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

HADOOP_COMMON_DIR="./"
HADOOP_COMMON_LIB_JARS_DIR="lib"
HADOOP_COMMON_LIB_NATIVE_DIR="lib/native"
HDFS_DIR="./"
HDFS_LIB_JARS_DIR="lib"
YARN_DIR="./"
YARN_LIB_JARS_DIR="lib"
MAPRED_DIR="./"
MAPRED_LIB_JARS_DIR="lib"

HADOOP_LIBEXEC_DIR=/usr/hdp/3.1.5.1-2/hadoop/libexec
HADOOP_CONF_DIR=/usr/hdp/3.1.5.1-2/hadoop/conf
HADOOP_COMMON_HOME=/usr/hdp/3.1.5.1-2/hadoop
HADOOP_HDFS_HOME=/usr/hdp/3.1.5.1-2/hadoop-hdfs
HADOOP_MAPRED_HOME=/usr/hdp/3.1.5.1-2/hadoop-mapreduce
HADOOP_YARN_HOME=/usr/hdp/3.1.5.1-2/hadoop-yarn

After adding this script back, the environment variables resolve correctly and the datanode starts.
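One way to restore the file is to copy it from a healthy node (a sketch; host-ok stands in for a real hostname):

# Copy hadoop-layout.sh from a healthy node; -p preserves mode and timestamps
scp -p host-ok:/usr/hdp/3.1.5.1-2/hadoop/libexec/hadoop-layout.sh \
    /usr/hdp/3.1.5.1-2/hadoop/libexec/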

The resulting environment variables can be inspected with the following commands:

[root@host-xxx ~]# hdfs envvars
JAVA_HOME='/usr/jdk64/jdk1.8.0_112'
HADOOP_HDFS_HOME='/usr/hdp/3.1.5.1-2/hadoop-hdfs'
HDFS_DIR='./'
HDFS_LIB_JARS_DIR='lib'
HADOOP_CONF_DIR='/usr/hdp/3.1.5.1-2/hadoop/conf'
HADOOP_TOOLS_HOME='/usr/hdp/3.1.5.1-2/hadoop'
HADOOP_TOOLS_DIR='share/hadoop/tools'
HADOOP_TOOLS_LIB_JARS_DIR='share/hadoop/tools/lib'
[root@host-xxx ~]# hadoop envvars
JAVA_HOME='/usr/jdk64/jdk1.8.0_112'
HADOOP_COMMON_HOME='/usr/hdp/3.1.5.1-2/hadoop'
HADOOP_COMMON_DIR='./'
HADOOP_COMMON_LIB_JARS_DIR='lib'
HADOOP_COMMON_LIB_NATIVE_DIR='lib/native'
HADOOP_CONF_DIR='/usr/hdp/3.1.5.1-2/hadoop/conf'
HADOOP_TOOLS_HOME='/usr/hdp/3.1.5.1-2/hadoop'
HADOOP_TOOLS_DIR='share/hadoop/tools'
HADOOP_TOOLS_LIB_JARS_DIR='share/hadoop/tools/lib'
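Note that HDFS_DIR='./' and HADOOP_COMMON_DIR='./' in the output match the values defined in hadoop-layout.sh rather than the Apache defaults, which confirms the restored layout file is being sourced.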
