1. Submitting a Spark job fails with org.apache.spark.sql.AnalysisException: Table or view not found: user_visit_action
Main method:
SparkConf conf = new SparkConf()
.setAppName(Constants.SPARK_APP_NAME_PAGE);
JavaSparkContext sc = new JavaSparkContext(conf);
SparkUtils.setMaster(conf);
SparkSession sparkSession = SparkSession.builder()
.getOrCreate();
Submit script:
$SPARK_HOME/bin/spark-submit \
--class com.micro.bigdata.page.PageConverRate \
--num-executors 1 \
--driver-memory 2000m \
--executor-memory 2000m \
--executor-cores 2 \
--files /usr/local/src/apache-hive-3.1.0-bin/conf/hive-site.xml \
/usr/local/spark_project/spark-project-1.0-SNAPSHOT-jar-with-dependencies.jar \
${1}
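The --files option above ships hive-site.xml to the driver and executors so that Spark can locate the Hive metastore. A minimal sketch of the entries that matter for table resolution (the thrift URI host/port and warehouse path below are assumptions for illustration; use your cluster's actual values):

```xml
<!-- hive-site.xml (sketch; host/port and path are assumptions) -->
<configuration>
  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://master:9083</value>
  </property>
  <property>
    <name>hive.metastore.warehouse.dir</name>
    <value>/hive/warehouse</value>
  </property>
</configuration>
```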
Run the script:
./spark_page.sh 3
Exception message:
org.apache.spark.sql.AnalysisException: Table or view not found: user_visit_action
Solution:
The SparkSession above is built without Hive support, so Spark falls back to its default in-memory catalog and cannot see tables registered in the Hive metastore. Add the spark.sql.warehouse.dir configuration and call enableHiveSupport(), as follows:
SparkConf conf = new SparkConf()
.setAppName(Constants.SPARK_APP_NAME_PAGE);
JavaSparkContext sc = new JavaSparkContext(conf);
SparkUtils.setMaster(conf);
SparkSession sparkSession = SparkSession.builder()
.config("spark.sql.warehouse.dir","hdfs://master:9000/hive/warehouse")
.enableHiveSupport()
.getOrCreate();
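With Hive support enabled, the session's catalog is backed by the Hive metastore, so user_visit_action now resolves. A minimal usage sketch, assuming spark-hive is on the classpath and the metastore is reachable (the row count shown with show() is illustrative, not output from the original job):

```java
// imports assumed: org.apache.spark.sql.Dataset, org.apache.spark.sql.Row

// Optional sanity check through the catalog API before querying.
boolean exists = sparkSession.catalog().tableExists("user_visit_action");

// Query the Hive table through the Hive-enabled session.
Dataset<Row> actions = sparkSession.sql("SELECT * FROM user_visit_action");
actions.show(10);
```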