Running in cluster mode
Running a Spark program fails with the following error:
Did not find registered driver with class com.mysql.jdbc.Driver
Solution:
- First, create the default configuration file from the template:
cp $SPARK_HOME/conf/spark-defaults.conf.template $SPARK_HOME/conf/spark-defaults.conf
- In $SPARK_HOME/conf/spark-defaults.conf, add the spark.driver.extraClassPath and spark.executor.extraClassPath properties. The value is the path to the jar; separate multiple paths with ":". For example:
# Example:
# spark.master                     spark://master:7077
# spark.eventLog.enabled           true
# spark.eventLog.dir               hdfs://namenode:8021/directory
# spark.serializer                 org.apache.spark.serializer.KryoSerializer
# spark.driver.memory              5g
# spark.executor.extraJavaOptions  -XX:+PrintGCDetails -Dkey=value -Dnumbers="one two three"
spark.driver.extraClassPath /usr/share/java/mysql-connector-java-5.1.28.jar
spark.executor.extraClassPath /usr/share/java/mysql-connector-java-5.1.28.jar
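After editing the file, it is easy to miss one of the two properties. The steps above can be sanity-checked with a small stdlib-only sketch that scans the config text for both keys (the file path and property names come from the step above; nothing Spark-specific is executed):

```python
# Sketch: verify that spark-defaults.conf defines both classpath properties.
# Pass the file's text, e.g. open("/path/to/spark-defaults.conf").read().
def check_defaults(conf_text):
    required = {"spark.driver.extraClassPath", "spark.executor.extraClassPath"}
    found = set()
    for line in conf_text.splitlines():
        line = line.strip()
        # skip blank lines and commented-out template entries
        if not line or line.startswith("#"):
            continue
        key = line.split(None, 1)[0]
        if key in required:
            found.add(key)
    return required - found  # the set of missing properties (empty means OK)
```

If the returned set is non-empty, the executors will still fail to find the driver even after a restart.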
- Copy spark-defaults.conf to the $SPARK_HOME/conf directory on every other node:
scp $SPARK_HOME/conf/spark-defaults.conf hadoop@mini1:$SPARK_HOME/conf/spark-defaults.conf
- Restart the cluster:
stop-all.sh
start-all.sh
Running in local mode
Add the following to Program arguments:
--jars E:\java\常用jar包\mysql-connector-java-5.0.8-bin.jar --driver-class-path E:\java\常用jar包\mysql-connector-java-5.0.8-bin.jar
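Independently of how the jar is put on the classpath, Spark's JDBC data source also accepts an explicit "driver" option, which avoids the driver-registration lookup that produces this error. A minimal sketch of such an options map (the url, dbtable, user, and password values are hypothetical placeholders, not taken from the original program):

```python
# Sketch: JDBC options as they would be passed to spark.read (option by option
# or as a dict). Naming the driver class explicitly sidesteps the
# "Did not find registered driver" lookup failure.
jdbc_options = {
    "url": "jdbc:mysql://localhost:3306/test",   # hypothetical database URL
    "dbtable": "person",                          # hypothetical table name
    "user": "root",                               # hypothetical credentials
    "password": "root",
    "driver": "com.mysql.jdbc.Driver",            # the class from the error message
}
```

The jar still has to be present via --jars / extraClassPath; the "driver" option only tells Spark which class inside it to load.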