// Write this partition's data as Parquet, repartitioning to the configured
// number of pre-write files, then register the partition with the Hive
// metastore so the new data becomes queryable.
tpDF.repartition(tableEntity.settings.preWriteFileNum)
  .write.mode(SaveMode.Append)
  .parquet(tpPath)
sqlContext.sql(
  s"ALTER TABLE ${table} ADD IF NOT EXISTS PARTITION (${partitionInfo}) LOCATION '${tpPath}'")
}
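For context, here is a minimal end-to-end sketch of the same pattern. It assumes a `SparkSession` with Hive support, a hypothetical external table `db.events` partitioned by `dt`, and an example warehouse path; the identifiers `tpDF`, `tableEntity`, `partitionInfo`, and `tpPath` in the snippet above come from the author's own code and are replaced with illustrative values here.

```scala
import org.apache.spark.sql.{SaveMode, SparkSession}

object AppendToExternalPartition {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("append-to-hive-external-partition")
      .enableHiveSupport() // needed so ALTER TABLE goes through the Hive metastore
      .getOrCreate()
    import spark.implicits._

    // Example data for a single partition (hypothetical schema).
    val dt = "2024-01-01"
    val tpDF = Seq(("a", 1), ("b", 2)).toDF("id", "value")

    // Write Parquet files directly into the partition directory under the
    // external table's location (hypothetical path).
    val tpPath = s"hdfs:///warehouse/db.db/events/dt=$dt"
    tpDF.repartition(2) // controls the number of output files per partition
      .write.mode(SaveMode.Append)
      .parquet(tpPath)

    // Register the new partition so Hive and Spark SQL can see it.
    spark.sql(
      s"ALTER TABLE db.events ADD IF NOT EXISTS PARTITION (dt='$dt') LOCATION '$tpPath'")

    spark.stop()
  }
}
```

Writing to the partition path directly and then issuing `ALTER TABLE ... ADD IF NOT EXISTS PARTITION` appends data without rewriting the rest of the table; running `MSCK REPAIR TABLE db.events` instead would discover all missing partitions at once, at the cost of scanning the table's directory tree.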
Title: Appending a Spark DataFrame to a Hive external partitioned table
Link: https://www.haomeiwen.com/subject/uncizqtx.html