Spark SQL configuration
cp /usr/local/src/apache-hive-0.13.0-bin/conf/hive-site.xml $SPARK_HOME/conf/   # copy Hive's hive-site.xml into Spark's conf dir so Spark SQL can reach the Hive metastore (source path assumed from the Hive install used below)
Launch spark-shell on YARN, adding the MySQL JDBC driver jar that the Hive metastore connection needs:
./bin/spark-shell --master yarn-client --jars /usr/local/src/apache-hive-0.13.0-bin/lib/mysql-connector-java-5.1.41-bin.jar
val df = spark.sql("select * from test_a limit 10")
Check whether the Hive table can be read (e.g. call df.show() to print the rows).
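The steps above can be sketched end to end. The Hive install path, driver jar, and table name are taken from these notes; the hive-site.xml location and the MySQL-backed metastore are assumptions about a typical setup of this kind:

```shell
# Assumed layout: Hive at /usr/local/src/apache-hive-0.13.0-bin, Spark at
# $SPARK_HOME, Hive metastore backed by MySQL (hence the connector jar).

# 1. Give Spark the Hive metastore configuration
cp /usr/local/src/apache-hive-0.13.0-bin/conf/hive-site.xml "$SPARK_HOME/conf/"

# 2. Launch spark-shell on YARN with the MySQL JDBC driver on the classpath
"$SPARK_HOME/bin/spark-shell" --master yarn-client \
  --jars /usr/local/src/apache-hive-0.13.0-bin/lib/mysql-connector-java-5.1.41-bin.jar

# 3. Inside the shell, run the query from the notes and print the result:
#      val df = spark.sql("select * from test_a limit 10")
#      df.show()
```

If df.show() prints rows from test_a, Spark is reading the Hive metastore correctly; a "Table or view not found" error usually means hive-site.xml was not picked up from $SPARK_HOME/conf.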
Source: https://www.haomeiwen.com/subject/xadkfqtx.html