Hi,
I want to initialize a Spark session using `DatabricksSession`, but its builder doesn't seem to accept a `SparkConf` instance via `.config()`. The following works with a plain `SparkSession`:
```
from pyspark import SparkConf
from pyspark.sql import SparkSession

# Initialize the configuration for the Spark session
confSettings = [
    ("spark.sql.legacy.timeParserPolicy", "CORRECTED"),
    ("spark.sql.mapKeyDedupPolicy", "LAST_WIN"),
    ("spark.sql.legacy.parquet.nanosAsLong", "true"),
]
conf = SparkConf() \
    .setMaster("local") \
    .setAll(confSettings) \
    .setExecutorEnv(pairs=confSettings)

# Initialize a Spark session from the SparkConf
spark = SparkSession.builder \
    .config(conf=conf) \
    .getOrCreate()
```
But the following throws an error:
```
from databricks.connect import DatabricksSession
from pyspark import SparkConf

# Initialize the configuration for the Spark session
confSettings = [
    ("spark.sql.legacy.timeParserPolicy", "CORRECTED"),
    ("spark.sql.mapKeyDedupPolicy", "LAST_WIN"),
    ("spark.sql.legacy.parquet.nanosAsLong", "true"),
]
conf = SparkConf() \
    .setMaster("local") \
    .setAll(confSettings) \
    .setExecutorEnv(pairs=confSettings)

# Initialize a Spark session against a Databricks cluster
spark = DatabricksSession.builder \
    .profile("<profile-name>") \
    .config(conf=conf) \
    .getOrCreate()
```
Is there another way to set configuration options on a Spark session created with `DatabricksSession`?
Thanks.