SparkException: resolving the "Dynamic partition strict mode" error

Problem scenario

Running testDF.write.mode("append").partitionBy("dt").saveAsTable("t_pgw_base_statistics_final_dy_test") in the spark-shell console fails with: org.apache.spark.SparkException: Dynamic partition strict mode requires at least one static partition column. To turn this off set hive.exec.dynamic.partition.mode=nonstrict

Solution

Set the following parameter, after which the write succeeds:

sqlContext.setConf("hive.exec.dynamic.partition.mode","nonstrict");
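Putting it together, here is a minimal spark-shell sketch of the whole flow, assuming a Spark 1.x-style sqlContext as in the post and an existing DataFrame testDF with a "dt" column (the companion setting hive.exec.dynamic.partition=true is a standard Hive option that is often enabled alongside it, not something stated in the original post):

```scala
// Allow fully dynamic partitioning: no static partition column is required.
sqlContext.setConf("hive.exec.dynamic.partition", "true")
sqlContext.setConf("hive.exec.dynamic.partition.mode", "nonstrict")

// The append now succeeds; one Hive partition is created per distinct "dt" value.
testDF.write
  .mode("append")
  .partitionBy("dt")
  .saveAsTable("t_pgw_base_statistics_final_dy_test")
```

On Spark 2.x and later, the equivalent is typically done through the session, e.g. spark.sql("SET hive.exec.dynamic.partition.mode=nonstrict"), though the sqlContext call above still works.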


Reprinted from blog.csdn.net/u013084266/article/details/82260262