Data Engineering

Runtime SQL Configuration - how to make it simple

Leszek
Contributor

Hi, I'm running a couple of notebooks in my pipeline and I would like to set a fixed value for 'spark.sql.shuffle.partitions' - the same value for every notebook. Should I do that by adding spark.conf.set... code in each notebook (runtime SQL configurations are per-session), or is there another, easier way to set this?

1 ACCEPTED SOLUTION

Accepted Solutions

Hubert-Dudek
Esteemed Contributor III

The easiest way is to set the Spark config under Advanced options in the cluster settings:

[screenshot: Spark config field under the cluster's Advanced options]

More info here: https://docs.databricks.com/clusters/configure.html#spark-configuration
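In the cluster's Spark config box, entries are written as space-separated key/value pairs, one per line. For example (the value 200 is just an illustration, not a recommendation):

```
spark.sql.shuffle.partitions 200
```

Every notebook attached to the cluster then starts its session with this value already applied.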


5 REPLIES 5

Hubert-Dudek
Esteemed Contributor III

The easiest way is to set the Spark config under Advanced options in the cluster settings:

[screenshot: Spark config field under the cluster's Advanced options]

More info here: https://docs.databricks.com/clusters/configure.html#spark-configuration

Prabakar
Esteemed Contributor III

Setting it either in the notebook or on the cluster will work, but the better option is to go with the cluster's Spark config.

jose_gonzalez
Moderator

Hi @Leszek​ ,

As @Hubert Dudek​ mentioned, I recommend adding your setting to the cluster's Spark configuration.

The difference between the notebook and the cluster is:

  • In the notebook, the Spark configuration applies only to that notebook's own Spark session.
  • On the cluster, the Spark setting is global and applies to all notebooks attached to the cluster.

Leszek
Contributor

Hi, thank you all for the tips. I tried setting this option in the Spark config before, but it didn't work for some reason. Today I tried again and it's working :).

Hubert-Dudek
Esteemed Contributor III

Great that it is working. Any chance to be selected as best answer? 🙂
