1. Environment
- Operating system: Windows 7
- JDK version: 1.8.0_221
- Hadoop version: 2.6.0
2. Download
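Assuming the stock Apache binary release is used (the download source is an assumption here), the 2.6.0 tarball is available from the Apache archive:
https://archive.apache.org/dist/hadoop/common/hadoop-2.6.0/hadoop-2.6.0.tar.gz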


3. Prerequisites
3.1. Configure the JAVA environment variables
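A minimal sketch of this step from an administrator command prompt (the JDK path is the default install location for 1.8.0_221 and is an assumption; setting the variables through the System Properties dialog works just as well):
rem Point JAVA_HOME at the JDK; the 8.3 short path sidesteps the space in "Program Files"
setx /M JAVA_HOME "C:\PROGRA~1\Java\jdk1.8.0_221"
rem Append the JDK bin directory to the system PATH (note: setx truncates values longer than 1024 characters)
setx /M PATH "%PATH%;C:\PROGRA~1\Java\jdk1.8.0_221\bin"
rem Open a new console afterwards and verify
java -version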


4. Installation and deployment
4.1. Extract the archive; the message that errors occurred during extraction can be ignored

4.2. Edit the configuration files
- hadoop-env.cmd - add the JAVA_HOME environment variable
Pitfall: on Windows, the space in "Program Files" breaks the path reference in the configuration file; replace "Program Files" with its short name PROGRA~1.
For the fix, see: Windows namenode format error - Error: JAVA_HOME is incorrectly set. Please update F:\hadoop\conf\hadoop-e...
set JAVA_HOME=C:\PROGRA~1\Java\jdk1.8.0_221
- .\etc\hadoop\core-site.xml
Pitfall: on Windows, the URI path needs a leading /, e.g. /E:/soft_work/hadoop-2.6.0/data/hdfs
For the fix, see: Windows namenode format error - ERROR namenode.NameNode: Failed to start namenode.
<configuration>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://127.0.0.1:8020</value>
    </property>
    <property>
        <name>hadoop.tmp.dir</name>
        <value>/E:/soft_work/hadoop-2.6.0/data/hdfs</value>
    </property>
</configuration>
- .\etc\hadoop\hdfs-site.xml
<configuration>
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
</configuration>
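Once the files above are edited, a quick sanity check (not part of the original steps) is to query an effective value from the Hadoop installation root; the second line below is the expected output and shows that the configuration directory is being picked up:
.\bin\hdfs.cmd getconf -confKey fs.defaultFS
hdfs://127.0.0.1:8020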
4.3. Format the namenode
.\bin\hdfs.cmd namenode -format
The output below indicates success; look for the line containing "has been successfully formatted."
2020-10-14 13:10:51,919 INFO common.Storage: Storage directory E:\soft_work\hadoop-2.6.0\data\hdfs\dfs\name has been successfully formatted.
2020-10-14 13:10:51,930 INFO namenode.FSImageFormatProtobuf: Saving image file E:\soft_work\hadoop-2.6.0\data\hdfs\dfs\name\current\fsimage.ckpt_0000000000000000000 using no compression
2020-10-14 13:10:52,043 INFO namenode.FSImageFormatProtobuf: Image file E:\soft_work\hadoop-2.6.0\data\hdfs\dfs\name\current\fsimage.ckpt_0000000000000000000 of size 389 bytes saved in 0 seconds.
2020-10-14 13:10:52,060 INFO namenode.NNStorageRetentionManager: Going to retain 1 images with txid >= 0
2020-10-14 13:10:52,066 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at jfp/172.18.16.29
************************************************************/
4.4. Start HDFS
.\sbin\start-dfs.cmd
Pitfall: when starting HDFS, the following error appears:
The system cannot find the file hadoop
For the fix, see: Windows HDFS startup error - the system cannot find the file hadoop.
Pitfall: startup error
Could not locate Hadoop executable: E:\soft_work\hadoop-2.6.0\bin\winutils.exe
For the fix, see: Windows HDFS startup error - Could not locate Hadoop executable: E:\soft_work\hadoop-2.6.0\bin\winutils.exe
The error message is as follows:
Caused by: java.io.FileNotFoundException: Could not locate Hadoop executable: E:\soft_work\hadoop-2.6.0\bin\winutils.exe
    at org.apache.hadoop.util.Shell.getQualifiedBinInner(Shell.java:605)
    at org.apache.hadoop.util.Shell.getQualifiedBin(Shell.java:578)
    at org.apache.hadoop.util.Shell.<clinit>(Shell.java:675)
    at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:78)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2871)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2896)
Pitfall: startup error
org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
For the fix, see: Windows HDFS startup error - org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
The error message is as follows:
2020-10-14 13:33:09,920 ERROR namenode.NameNode: Failed to start namenode.
java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
    at org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)
    at org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:606)
    at org.apache.hadoop.fs.FileUtil.canWrite(FileUtil.java:971)
    at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.analyzeStorage(Storage.java:613)
    at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.analyzeStorage(Storage.java:573)
    at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverStorageDirs(FSImage.java:365)
    at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverTransitionRead(FSImage.java:221)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFSImage(FSNamesystem.java:1072)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFromDisk(FSNamesystem.java:704)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.loadNamesystem(NameNode.java:665)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:727)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:950)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:929)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1653)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1720)
- After working through the pitfalls above, startup completes and both the namenode and datanode are running.

4.5. Check the web UI to confirm startup succeeded
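Assuming the default Hadoop 2.x HTTP port (core-site.xml above only sets the RPC address), the NameNode web UI should be reachable at:
http://localhost:50070/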

5. Extras
5.1 Check the namenode and datanode processes with jps
Windows needs extra setup for this; see: the jps command cannot list Java processes on Windows
C:\Users\user>jps
5968 Jps
6776 NameNode
9816 DataNode
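As a further sanity check (not in the original post), a few basic HDFS commands run from the Hadoop root confirm that the filesystem accepts writes; the directory name /test is arbitrary:
.\bin\hdfs.cmd dfs -mkdir /test
.\bin\hdfs.cmd dfs -put .\etc\hadoop\core-site.xml /test
.\bin\hdfs.cmd dfs -ls /test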
6. Install and start the full Hadoop stack
To start HDFS, YARN, and MapReduce together, follow these steps:
6.1 Edit the configuration files
- .\etc\hadoop\yarn-site.xml
<configuration>
    <property>
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle</value>
    </property>
    <property>
        <name>yarn.resourcemanager.hostname</name>
        <value>localhost</value>
    </property>
</configuration>
- .\etc\hadoop\mapred-site.xml
<configuration>
    <property>
        <name>mapreduce.framework.name</name>
        <value>yarn</value>
    </property>
</configuration>
6.2 Start everything
.\sbin\start-all.cmd
6.3 Check the running processes
Startup succeeded:
C:\Users\user>jps
10096 NodeManager
10304 Jps
11240 NameNode
14856 ResourceManager
12892 DataNode
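With the daemons above running, an end-to-end check (not in the original; the examples jar name assumes the stock 2.6.0 binary distribution) is to submit the bundled pi estimator job to YARN from the Hadoop root:
.\bin\hadoop.cmd jar .\share\hadoop\mapreduce\hadoop-mapreduce-examples-2.6.0.jar pi 2 5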
6.4 Check the YARN Web UI
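Assuming the default ResourceManager HTTP port (yarn-site.xml above only sets the hostname), the YARN web UI should be reachable at:
http://localhost:8088/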
