I think it used to be possible to set the number of shuffle partitions on a Databricks SQL warehouse with, for example, SET spark.sql.shuffle.partitions = 20000. However, when I run this now, I get the following error:
```
[CONFIG_NOT_AVAILABLE] Configuration spark.sql.shuffle.partitions is not available. SQLSTATE: 42K0I
```
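For reference, this is a minimal reproduction of what I'm doing (the table and column names are placeholders for my real queries):

```sql
-- Run against a Databricks SQL warehouse; the SET statement itself
-- fails with the CONFIG_NOT_AVAILABLE error above
SET spark.sql.shuffle.partitions = 20000;

-- The kind of shuffle-heavy aggregation I'd like this to apply to
-- (sales_events / customer_id are placeholder names)
SELECT customer_id, COUNT(*) AS n_events
FROM sales_events
GROUP BY customer_id;
```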
Has the ability to set shuffle partitions been removed as part of an update, or is there an alternative way of achieving the same effect? For example, I don't know whether a partitioning hint inside the query itself (see the sketch below) would help.
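To illustrate the kind of alternative I mean: the REPARTITION hint below is standard Spark SQL syntax, but I'm not sure whether a SQL warehouse honors it, or whether it even influences the shuffle partition count used by the aggregation rather than just repartitioning the input (table and column names are again placeholders):

```sql
-- Standard Spark SQL partitioning hint; unclear to me whether a SQL
-- warehouse honors it, or whether it affects the GROUP BY shuffle
SELECT /*+ REPARTITION(20000) */ customer_id, COUNT(*) AS n_events
FROM sales_events
GROUP BY customer_id;
```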
I have several queries that would hugely benefit from a larger number of shuffle partitions, as I can see massive amounts of spill when checking the Spark UI.