Translated notes on the Linux console output from the first Hadoop start-up after deployment

Author: 风荥 | Published 2018-01-27 12:37

    [hadoop@DataWorks hadoop]$ ./sbin/start-all.sh

    This script is deprecated. Instead Use start-dfs.sh and start-yarn.sh

    # This is a deprecation notice: instead of start-all.sh, use start-dfs.sh and start-yarn.sh.

    # (It is only a hint, not a hard block; the script still runs.)
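    # Since this is only a deprecation notice, the cleaner fix is to call the two replacement scripts directly. A minimal sketch, assuming the same $HADOOP_HOME layout as the prompt above:

    [hadoop@DataWorks hadoop]$ ./sbin/start-dfs.sh    # brings up NameNode, DataNodes and SecondaryNameNode
    [hadoop@DataWorks hadoop]$ ./sbin/start-yarn.sh   # brings up ResourceManager and NodeManagers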

    18/01/27 02:29:47 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

    # 18/01/27 02:29:47 WARN: the native-hadoop library could not be loaded on this platform, so Hadoop falls back to the built-in pure-Java classes where applicable. Functionality is unaffected, but native optimizations such as compression codecs are unavailable.
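    # The warning is usually harmless. To see which native libraries this build can actually load, Hadoop 2.x ships a checknative subcommand (the exact output depends on how Hadoop was compiled):

    [hadoop@DataWorks hadoop]$ ./bin/hadoop checknative -a
    # lists hadoop, zlib, snappy, lz4, bzip2, openssl and whether each native library was found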

    Starting namenodes on [DataWorks.Master]

    # Starting the NameNode(s) on the host DataWorks.Master.

    DataWorks.Master: starting namenode, logging to /home/hadoop/hadoop/logs/hadoop-hadoop-namenode-DataWorks.Master.out

    # DataWorks.Master: starting the NameNode; its start-up output is written to /home/hadoop/hadoop/logs/hadoop-hadoop-namenode-DataWorks.Master.out.
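    # If the NameNode does not stay up, the file named in that line is the first place to look; the matching .log file in the same directory usually carries the detailed stack traces. A quick check using the path from the message:

    [hadoop@DataWorks hadoop]$ tail -n 50 /home/hadoop/hadoop/logs/hadoop-hadoop-namenode-DataWorks.Master.out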

    The authenticity of host 'dataworks.node1 (192.168.2.11)' can't be established.

    # SSH cannot verify the identity (authenticity) of host 'dataworks.node1 (192.168.2.11)' because its key is not yet in known_hosts; this is expected on a first connection.

    RSA key fingerprint is d0:4f:6f:26:cb:d6:4b:2d:cf:5e:d2:df:84:a1:b3:da.

    # The host's RSA key fingerprint is d0:4f:6f:26:cb:d6:4b:2d:cf:5e:d2:df:84:a1:b3:da.

    Are you sure you want to continue connecting (yes/no)? The authenticity of host 'localhost (::1)' can't be established.

    # Prompt: are you sure you want to continue connecting (yes/no)? Meanwhile the identity of 'localhost (::1)' cannot be verified either.

    RSA key fingerprint is d0:4f:6f:26:cb:d6:4b:2d:cf:5e:d2:df:84:a1:b3:da.

    # The RSA key fingerprint of localhost, same value as above.

    Are you sure you want to continue connecting (yes/no)? The authenticity of host 'dataworks.node2 (192.168.2.12)' can't be established.

    # Prompt: continue connecting (yes/no)? The identity of host 'dataworks.node2 (192.168.2.12)' cannot be verified either.

    RSA key fingerprint is d0:4f:6f:26:cb:d6:4b:2d:cf:5e:d2:df:84:a1:b3:da.

    # The RSA key fingerprint of dataworks.node2.

    Are you sure you want to continue connecting (yes/no)? yes 

    # Answered yes to one of the pending prompts.

    DataWorks.Node1: Warning: Permanently added 'dataworks.node1,192.168.2.11' (RSA) to the list of known hosts.

    # DataWorks.Node1: warning: the RSA key for 'dataworks.node1,192.168.2.11' has been permanently added to the list of known hosts (~/.ssh/known_hosts).

    DataWorks.Node1: Connection closed by 192.168.2.11

    # DataWorks.Node1: the connection was closed by 192.168.2.11, so the daemon on this node was not started.
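    # A connection closed right after the key exchange usually means the master could not log in to the node without a password. A rough sketch of setting up passwordless SSH from the master, using the host names from the output above (adjust for your own cluster):

    [hadoop@DataWorks hadoop]$ ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa   # skip if the key already exists
    [hadoop@DataWorks hadoop]$ ssh-copy-id hadoop@dataworks.node1
    [hadoop@DataWorks hadoop]$ ssh-copy-id hadoop@dataworks.node2
    [hadoop@DataWorks hadoop]$ ssh-copy-id hadoop@localhost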

    localhost: Host key verification failed.

    # localhost: host key verification failed, because the yes/no prompt for this host was never answered.

    DataWorks.Node2: Host key verification failed.

    # DataWorks.Node2: host key verification failed, for the same reason.
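    # One way to avoid these interactive yes/no prompts on the next start is to pre-populate ~/.ssh/known_hosts, for example with ssh-keyscan (verify the fingerprints yourself if the network is not trusted):

    [hadoop@DataWorks hadoop]$ ssh-keyscan -t rsa localhost dataworks.node1 dataworks.node2 >> ~/.ssh/known_hosts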

    Starting secondary namenodes [DataWorks.Master]

    # Starting the SecondaryNameNode on DataWorks.Master (a checkpointing helper for the NameNode, not a standby "second NameNode").

    DataWorks.Master: starting secondarynamenode, logging to /home/hadoop/hadoop/logs/hadoop-hadoop-secondarynamenode-DataWorks.Master.out

    # DataWorks.Master: starting the SecondaryNameNode; its output is written to /home/hadoop/hadoop/logs/hadoop-hadoop-secondarynamenode-DataWorks.Master.out.
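    # In Hadoop 2.x the host that runs the SecondaryNameNode comes from dfs.namenode.secondary.http-address in hdfs-site.xml; the effective value can be checked without opening the file:

    [hadoop@DataWorks hadoop]$ ./bin/hdfs getconf -confKey dfs.namenode.secondary.http-address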

    18/01/27 03:45:55 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

    # 18/01/27 03:45:55 WARN: the same native-hadoop library warning as above; the built-in Java classes are used instead.

    starting yarn daemons

    # Starting the YARN daemons.

    starting resourcemanager, logging to /home/hadoop/hadoop/logs/yarn-hadoop-resourcemanager-DataWorks.Node2.out

    # Starting the ResourceManager; its output is written to /home/hadoop/hadoop/logs/yarn-hadoop-resourcemanager-DataWorks.Node2.out. (Note that this file name carries the hostname DataWorks.Node2 while the NameNode's carried DataWorks.Master, which may point at an inconsistent hostname setup worth checking.)

    The authenticity of host 'localhost (::1)' can't be established.

    # Again, the identity of 'localhost (::1)' cannot be verified; start-yarn.sh runs into the same host-key prompts as start-dfs.sh did.

    RSA key fingerprint is d0:4f:6f:26:cb:d6:4b:2d:cf:5e:d2:df:84:a1:b3:da.

    Are you sure you want to continue connecting (yes/no)? The authenticity of host 'dataworks.node2 (192.168.2.12)' can't be established.

    RSA key fingerprint is d0:4f:6f:26:cb:d6:4b:2d:cf:5e:d2:df:84:a1:b3:da.

    Are you sure you want to continue connecting (yes/no)? hadoop@dataworks.node1's password:

    Are you sure you want to continue connecting (yes/no)? hadoop@dataworks.node1's password:

    Please type 'yes' or 'no':

    Please type 'yes' or 'no':

    DataWorks.Node1: Connection closed by 192.168.2.11

    localhost: Host key verification failed.

    # localhost: host key verification failed.

    DataWorks.Node2: Host key verification failed.

    You have new mail in /var/spool/mail/root

    # You have new mail in /var/spool/mail/root (an unrelated system mail notice).

    [hadoop@DataWorks hadoop]$


    The first Hadoop start-up almost inevitably runs into configuration mistakes, so I translated and annotated the whole console transcript above as a record. It did indeed surface quite a few problems, mainly around SSH host keys and passwordless login between the nodes; a rough restart-and-verify sequence is sketched below.
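    # After fixing the SSH problems above, a rough way to restart cleanly and confirm that every daemon actually came up (jps is part of the JDK; hosts and layout as above):

    [hadoop@DataWorks hadoop]$ ./sbin/stop-yarn.sh && ./sbin/stop-dfs.sh
    [hadoop@DataWorks hadoop]$ ./sbin/start-dfs.sh && ./sbin/start-yarn.sh
    [hadoop@DataWorks hadoop]$ jps
    # expect NameNode and SecondaryNameNode on the master, ResourceManager on whichever host yarn-site.xml names,
    # and DataNode plus NodeManager on each worker
    [hadoop@DataWorks hadoop]$ ./bin/hdfs dfsadmin -report   # confirms the DataNodes have registered with the NameNode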
