Spark Component Deployment - Single Node

Author: CoderInsight | Published 2022-08-09 09:41

I. [[Hadoop平台搭建-单节点-伪分布式|Build the Hadoop base platform]]

II. Spark on YARN

This page amounts to configuring a standalone (single-node) Spark. Prerequisite: Scala is installed and its environment variables are configured.

1. Extract and install Scala in advance, then extract Spark

2. Copy the jar

# Copy the jar from Spark's yarn directory into Hadoop's yarn directory
Source: /usr/local/spark/yarn/spark-2.4.4-yarn-shuffle.jar
Destination: /usr/local/hadoop/share/hadoop/yarn
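The copy step above can be sketched as a single command, assuming the install paths shown in this article (adjust them to your layout):

```shell
# Copy the YARN shuffle-service jar into Hadoop's yarn classpath directory
cp /usr/local/spark/yarn/spark-2.4.4-yarn-shuffle.jar \
   /usr/local/hadoop/share/hadoop/yarn/
```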

3. Edit spark-env.sh

export SPARK_MASTER_IP=master                         # hostname of the Spark master
export HADOOP_CONF_DIR=/usr/local/hadoop/etc/hadoop   # lets Spark find the YARN/HDFS configs
export JAVA_HOME=/usr/local/jdk                       # JDK install directory
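To check that the configuration works, you can submit the bundled SparkPi example to YARN. This is a sketch assuming the /usr/local/spark install path from this article and the examples jar name that ships with Spark 2.4.4 (verify the exact filename under examples/jars/):

```shell
# Submit the SparkPi example to the local YARN cluster in client mode;
# the final argument (10) is the number of partitions/tasks.
/usr/local/spark/bin/spark-submit \
  --master yarn \
  --deploy-mode client \
  --class org.apache.spark.examples.SparkPi \
  /usr/local/spark/examples/jars/spark-examples_2.11-2.4.4.jar 10
```

If the job finishes and prints an approximation of Pi, Spark on YARN is working; if the errors listed in section 4 appear instead, see the memory-check fix below.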

4. Troubleshooting

Reference: Spark cluster issues

ERROR client.TransportClient: Failed to send RPC 6600979308376699964 to /192.168.56.103:56283: java.nio.channels.ClosedChannelException
java.nio.channels.ClosedChannelException

ERROR spark.SparkContext: Error initializing SparkContext.
java.lang.IllegalStateException: Spark context stopped while waiting for backend

ERROR cluster.YarnSchedulerBackend$YarnSchedulerEndpoint: Sending RequestExecutors(0,0,Map()) to AM was unsuccessful
java.io.IOException: Failed to send RPC 6600979308376699964 to /192.168.56.103:56283: java.nio.channels.ClosedChannelException

Caused by: java.nio.channels.ClosedChannelException

Exception in thread "main" java.lang.IllegalStateException: Spark context stopped while waiting for backend

ERROR util.Utils: Uncaught exception in thread Yarn application state monitor
org.apache.spark.SparkException: Exception thrown in awaitResult

Caused by: java.io.IOException: Failed to send RPC 6600979308376699964 to /192.168.56.103:56283: java.nio.channels.ClosedChannelException

Caused by: java.nio.channels.ClosedChannelException
<!-- These errors occur when YARN kills a container for exceeding its physical/virtual memory limits (common on small single-node setups), which closes the RPC channel. As a workaround, disable the memory checks by adding the following to yarn-site.xml: -->

<property>
    <name>yarn.nodemanager.pmem-check-enabled</name>
    <value>false</value>
</property>

<property>
    <name>yarn.nodemanager.vmem-check-enabled</name>
    <value>false</value>
</property>
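The NodeManagers only read yarn-site.xml at startup, so restart YARN after the change. A sketch using the Hadoop install path assumed throughout this article:

```shell
# Restart YARN so NodeManagers pick up the new yarn-site.xml
/usr/local/hadoop/sbin/stop-yarn.sh
/usr/local/hadoop/sbin/start-yarn.sh
```

Note that disabling the checks is a workaround; an alternative is to raise the container memory limits (e.g. yarn.nodemanager.resource.memory-mb) or lower the executor memory so containers stay within bounds.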


Original link: https://www.haomeiwen.com/subject/ukjywrtx.html