06-09-2016 01:22 PM
I've got a table I want to add some data to, and it's partitioned. I want to use dynamic partitioning, but I get this error:
org.apache.spark.SparkException: Dynamic partition strict mode requires at least one static partition column. To turn this off set hive.exec.dynamic.partition.mode=nonstrict
at org.apache.spark.sql.hive.execution.InsertIntoHiveTable.sideEffectResult$lzycompute(InsertIntoHiveTable.scala:168)
at org.apache.spark.sql.hive.execution.InsertIntoHiveTable.sideEffectResult(InsertIntoHiveTable.scala:127)
at org.apache.spark.sql.hive.execution.InsertIntoHiveTable.doExecute(InsertIntoHiveTable.scala:263)
I've set hive.exec.dynamic.partition.mode=nonstrict and restarted Hive in Ambari, but when I re-run the spark-shell job I still get the error.
Should I set it somewhere else, in the Hive config?
Here is the command:
df2.write.mode("append").partitionBy("p_date", "p_store_id").saveAsTable("TLD.ticket_payment_testinsert")
df2 is a DataFrame with a bunch of CSV data read into it.
I've also tried setting it in my spark-shell command:
spark-shell --master yarn-client --packages com.databricks:spark-csv_2.11:1.4.0 --num-executors 4 --executor-cores 5 --executor-memory 8G --queue hadoop-capq --conf "hive.exec.dynamic.partition.mode=nonstrict"
but I get this warning:
Warning: Ignoring non-spark config property: hive.exec.dynamic.partition.mode=nonstrict
Accepted Solutions
06-10-2016 09:44 AM
Try this:
hiveContext.setConf("hive.exec.dynamic.partition", "true")
hiveContext.setConf("hive.exec.dynamic.partition.mode", "nonstrict")
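For context: spark-shell only forwards properties whose names start with spark. via --conf (hence the "Ignoring non-spark config property" warning above), so Hive settings like this one have to be applied on the Hive context at runtime. A minimal spark-shell (Spark 1.x) sketch putting the fix together with the write from the question; the CSV path and header option are assumptions for illustration:

// In a spark-shell built with Hive support, sqlContext is already a HiveContext.
val hiveContext = sqlContext

// Enable dynamic partitioning for this session.
hiveContext.setConf("hive.exec.dynamic.partition", "true")
hiveContext.setConf("hive.exec.dynamic.partition.mode", "nonstrict")

// df2 as in the question: CSV data loaded with spark-csv
// (the input path and header option below are hypothetical).
val df2 = hiveContext.read
  .format("com.databricks.spark.csv")
  .option("header", "true")
  .load("/path/to/ticket_payments.csv")

// With nonstrict mode, both partition columns can be resolved dynamically.
df2.write
  .mode("append")
  .partitionBy("p_date", "p_store_id")
  .saveAsTable("TLD.ticket_payment_testinsert")

Alternatively, Hadoop/Hive properties can sometimes be passed from the command line by prefixing them with spark.hadoop. (for example --conf "spark.hadoop.hive.exec.dynamic.partition.mode=nonstrict"), since Spark copies spark.hadoop.* entries into the Hadoop configuration, but the runtime setConf call above is the more direct route here.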
01-04-2018 05:44 PM
I ran into a similar problem too and solved it with the method above. Thanks @peyman! Here is the Java version I use:
import org.apache.spark.sql.SparkSession;

// Lazily builds a single SparkSession with dynamic partitioning enabled.
public class JavaSparkSessionSingletonUtil {

    private static transient SparkSession instance = null;

    public static SparkSession getInstance(String appName) {
        if (instance == null) {
            // Drop any previously registered default session before building our own.
            SparkSession.clearDefaultSession();
            instance = SparkSession.builder()
                    .appName(appName)
                    .config("hive.exec.dynamic.partition", "true")
                    .config("hive.exec.dynamic.partition.mode", "nonstrict")
                    // .config("spark.sql.warehouse.dir", new File("spark-warehouse").getAbsolutePath())
                    // .config("spark.driver.allowMultipleContexts", "true")
                    .enableHiveSupport()
                    .getOrCreate();
        }
        return instance;
    }
}
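One thing worth noting about this pattern: the configs have to be in place before the first getOrCreate() call, since the session is built only once and the null check makes every later call reuse it. Also, getInstance is not synchronized, so if several driver threads could hit it concurrently, consider marking the method synchronized to avoid building two sessions.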
06-13-2016 03:53 PM
I got it working. This was exactly what I needed. Thank you, @Peyman Mohajerian!