SparkException: Dynamic partition strict mode requires at least one static partition column


Author: juddar | Published 2019-08-10 16:50

    Problem description:

    Running the following command in the spark-shell console to insert data into a Hive partitioned table:

    scala> spark.sql("select usrid,age from w_a").map(x=>(x.getString(0),x.getString(1),getNowDate)).toDF("usrid", "age", "dt").write.mode(SaveMode.Append).insertInto("w_d")
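    Note that this one-liner assumes SaveMode is in scope and that getNowDate is a user-defined helper returning the current date as a String for the dt partition column; neither is shown in the post. A minimal sketch of those assumptions might look like this:

    scala> import org.apache.spark.sql.SaveMode

    scala> // Hypothetical helper: today's date formatted as yyyy-MM-dd for the dt partition column
    scala> def getNowDate: String = java.time.LocalDate.now.format(java.time.format.DateTimeFormatter.ofPattern("yyyy-MM-dd"))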

    It fails with the following error:

    org.apache.spark.SparkException: Dynamic partition strict mode requires at least one static partition column. To turn this off set hive.exec.dynamic.partition.mode=nonstrict 

    The fix is to switch Hive's dynamic partition mode to non-strict:

    sqlContext.setConf("hive.exec.dynamic.partition.mode","nonstrict");

    In Spark 2.0 and later, SQLContext is wrapped inside SparkSession, so the setting becomes:

    spark.sqlContext.setConf("hive.exec.dynamic.partition.mode","nonstrict")
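    Putting it together, a spark-shell session that applies the setting before the write would look roughly like this (a sketch assuming the same w_a and w_d tables and the getNowDate helper above):

    scala> spark.sqlContext.setConf("hive.exec.dynamic.partition.mode", "nonstrict")

    scala> spark.sql("select usrid,age from w_a").map(x=>(x.getString(0),x.getString(1),getNowDate)).toDF("usrid", "age", "dt").write.mode(SaveMode.Append).insertInto("w_d")

    With the mode set to nonstrict, the insert can resolve the dt partition value dynamically from the data instead of requiring a static partition specification.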

