1. Running a Hadoop command fails with metrics.MetricsUtil: Unable to obtain hostName
20/03/29 09:55:09 INFO Configuration.deprecation: session.id is deprecated. Instead, use dfs.metrics.session-id
20/03/29 09:55:09 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
20/03/29 09:55:34 INFO metrics.MetricsUtil: Unable to obtain hostName
java.net.UnknownHostException: wzy.biggie01: wzy.biggie01: Name or service not known
at java.net.InetAddress.getLocalHost(InetAddress.java:1505)
at org.apache.hadoop.metrics.MetricsUtil.getHostName(MetricsUtil.java:95)
at org.apache.hadoop.metrics.MetricsUtil.createRecord(MetricsUtil.java:84)
at org.apache.hadoop.metrics.jvm.JvmMetrics.<init>(JvmMetrics.java:87)
at org.apache.hadoop.metrics.jvm.JvmMetrics.init(JvmMetrics.java:78)
at org.apache.hadoop.metrics.jvm.JvmMetrics.init(JvmMetrics.java:65)
at org.apache.hadoop.mapred.LocalJobRunnerMetrics.<init>(LocalJobRunnerMetrics.java:40)
at org.apache.hadoop.mapred.LocalJobRunner.<init>(LocalJobRunner.java:714)
at org.apache.hadoop.mapred.LocalJobRunner.<init>(LocalJobRunner.java:707)
at org.apache.hadoop.mapred.LocalClientProtocolProvider.create(LocalClientProtocolProvider.java:42)
at org.apache.hadoop.mapreduce.Cluster.initialize(Cluster.java:95)
at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:82)
at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:75)
at org.apache.hadoop.mapreduce.Job$9.run(Job.java:1260)
As shown above, right after setting up the Hadoop environment, running a command fails with "Unable to obtain hostName". The cause is that /etc/hosts has not been updated, so it contains no mapping that matches the machine's hostname.
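For context, the call that fails in the stack trace is java.net.InetAddress.getLocalHost(), which Hadoop's MetricsUtil uses to resolve the local hostname. The following standalone sketch (the class name HostNameCheck is just an illustration, not part of Hadoop) performs the same lookup; on a machine whose hostname is missing from /etc/hosts it hits the same UnknownHostException:

import java.net.InetAddress;
import java.net.UnknownHostException;

public class HostNameCheck {
    public static void main(String[] args) {
        try {
            // Same call as in the stack trace (InetAddress.getLocalHost)
            String host = InetAddress.getLocalHost().getHostName();
            System.out.println("Resolved local hostname: " + host);
        } catch (UnknownHostException e) {
            // This is the failure Hadoop's MetricsUtil runs into before the fix
            System.err.println("Unable to obtain hostName: " + e.getMessage());
        }
    }
}

Once /etc/hosts is fixed as described below, the same program prints the hostname instead of failing.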
Solution:
Check the Linux hostname (here my hostname is wzy.biggie01; yours will differ):
$ hostname
wzy.biggie01
Then edit /etc/hosts and add the hostname to the 127.0.0.1 mapping (or add a new line mapping the machine's IP to the hostname). Editing this file requires root privileges.
# vi /etc/hosts
The file after editing:
127.0.0.1 localhost localhost.localdomain localhost4 localhost4.localdomain4
::1 localhost localhost.localdomain localhost6 localhost6.localdomain6
192.168.1.102 wzy.biggie01
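To confirm the mapping takes effect, resolve the hostname again (your hostname and IP will differ); the output should show the entry you just added, roughly:
$ getent hosts wzy.biggie01
192.168.1.102   wzy.biggie01
After that, re-running the Hadoop command should no longer report the UnknownHostException.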