According to the documentation (https://github.com/GoogleCloudDataproc/spark-bigquery-connector/blob/master/README.md), this is the default value, but I just tried it and it didn't work.
I'm using DBR 16.3 and all partitions are still being deleted. This is the code I'm using, with no success:

```python
spark = (
    SparkSession.builder.config("spark.datasource.bigquery.intermediateFormat", "orc")
    .config("spark.sql.sources.partitionO...
```
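For context, the second config key is cut off above; my assumption is that it is Spark's `spark.sql.sources.partitionOverwriteMode` setting, which accepts `static` (the default, which truncates every matching partition) or `dynamic` (which only overwrites partitions present in the incoming DataFrame). A minimal sketch of the session setup under that assumption, not a confirmed fix for the connector behavior:

```python
from pyspark.sql import SparkSession

# Sketch only: the second key is my guess at the truncated config above.
# "dynamic" makes INSERT OVERWRITE replace only the partitions that appear
# in the written DataFrame, instead of deleting all existing partitions.
spark = (
    SparkSession.builder
    .config("spark.datasource.bigquery.intermediateFormat", "orc")
    .config("spark.sql.sources.partitionOverwriteMode", "dynamic")
    .getOrCreate()
)
```

Note that `spark.sql.sources.partitionOverwriteMode` governs Spark's own file-source writes; whether the BigQuery connector honors it is exactly what is in question here.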