Azure Databricks: How to add Spark configuration in Databricks cluster?
06-24-2021 08:27 AM
I am using a Databricks Spark cluster and want to add a customized Spark configuration.
There is Databricks documentation on this, but I cannot work out what changes I should make. Can someone please share an example of how to configure the Databricks cluster?
Is there any way to see the default Spark configuration for a Databricks cluster?
1 REPLY
06-24-2021 04:59 PM
You can set the configurations in the Databricks cluster UI, in the cluster's Spark config section:
https://docs.databricks.com/clusters/configure.html#spark-configuration
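For example, entries in the Spark config box go one property per line, with the key and value separated by a space. A minimal sketch (the values here are illustrative, not recommendations):

spark.sql.shuffle.partitions 100
spark.serializer org.apache.spark.serializer.KryoSerializer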
To see the current configuration, run the code below in a notebook cell:
%sql
SET;
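Note that SET on its own lists only properties that have been explicitly set, not the built-in defaults. Spark SQL also accepts a -v flag that lists all properties along with their default values and a description of each (a sketch using the standard Spark SQL SET command):

%sql
SET -v;

You can also look up a single property, e.g. SET spark.sql.shuffle.partitions; in SQL, or spark.conf.get("spark.sql.shuffle.partitions") from a Python cell.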

