
Run Spark code in a notebook by setting Spark conf at runtime instead of using databricks-connect configure

Surajv
New Contributor III

Hi community, 

I wanted to understand if there is a way to pass config values to the Spark session at runtime, rather than using `databricks-connect configure`, in order to run Spark code.

One way I found out is given here: https://stackoverflow.com/questions/63088121/configuring-databricks-connect-using-python-os-module
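Roughly, the approach from that post is to export the connection settings as environment variables before the session is created. A minimal sketch (all values are placeholders, and this assumes the legacy Databricks Connect client, which reads these variables when no config file is present):

import os

# Legacy Databricks Connect reads its connection settings from these
# environment variables if no .databricks-connect file exists.
# All values below are placeholders.
os.environ["DATABRICKS_ADDRESS"] = "https://<workspace-url>"
os.environ["DATABRICKS_API_TOKEN"] = "<token>"
os.environ["DATABRICKS_CLUSTER_ID"] = "<cluster-id>"
os.environ["DATABRICKS_ORG_ID"] = "<org-id>"
os.environ["DATABRICKS_PORT"] = "15001"  # default Databricks Connect port

from pyspark.sql import SparkSession

# The session picks up the environment variables at creation time.
spark = SparkSession.builder.getOrCreate()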

The other way was running code like SparkSession.builder.appName('NewSpark').getOrCreate(), and then setting the Spark conf credentials, i.e.:
spark.conf.set("spark.databricks.service.token", "<token>")
spark.conf.set("spark.databricks.service.address", "<address>"), etc.

But using the above approach gives me this error: Caused by: java.lang.RuntimeException: Config file /home/ec2user/.databricks-connect not found. Please run `databricks-connect configure` to accept the end user license agreement and configure Databricks Connect.

Is there a way to run Spark code purely via Spark conf settings in code, without the .databricks-connect config file ever being created or populated?

1 REPLY

Kaniz_Fatma
Community Manager

Hi @Surajv, to pass configuration values to a Spark session at runtime in PySpark without relying on the Databricks Connect config file, you can access and set Spark configuration parameters programmatically. First, retrieve the current Spark context settings with `spark.sparkContext.getConf().getAll()`. Then set custom parameters with `spark.conf.set("key", "value")` inside your application. Note that connection properties such as the service address and token need to be in place when the session is created, so pass them to the session builder before `getOrCreate()` rather than setting them afterwards; otherwise the client may still look for the missing `.databricks-connect` file and raise the error you saw. Remember to stop the Spark session (`spark.stop()`) when done.
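A minimal sketch of the idea (the address, token, and cluster ID are placeholders, and this assumes the legacy Databricks Connect client, which also accepts its connection settings as Spark conf properties passed at session creation):

from pyspark.sql import SparkSession

# Pass the Databricks Connect connection settings on the builder so they
# are in place before the session is created (all values are placeholders).
spark = (
    SparkSession.builder.appName("NewSpark")
    .config("spark.databricks.service.address", "https://<workspace-url>")
    .config("spark.databricks.service.token", "<token>")
    .config("spark.databricks.service.clusterId", "<cluster-id>")
    .getOrCreate()
)

# Inspect the settings the session actually picked up.
for key, value in spark.sparkContext.getConf().getAll():
    print(key, "=", value)

# Non-connection parameters can still be changed at runtime.
spark.conf.set("spark.sql.shuffle.partitions", "8")

# Stop the session when done.
spark.stop()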
