Hi community,
I wanted to understand whether there is a way to pass config values to the Spark session at runtime, instead of running databricks-connect configure, in order to run Spark code.
One way I found out is given here: https://stackoverflow.com/questions/63088121/configuring-databricks-connect-using-python-os-module
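Based on that Stack Overflow answer, the idea is to set the Databricks Connect settings as environment variables from Python before the first Spark session is created. A minimal sketch (the placeholder values are hypothetical and must be replaced with your own workspace details):

```python
import os

# Hypothetical placeholder values -- substitute your own workspace details.
os.environ["DATABRICKS_ADDRESS"] = "https://<workspace-url>"
os.environ["DATABRICKS_API_TOKEN"] = "<token>"
os.environ["DATABRICKS_CLUSTER_ID"] = "<cluster-id>"
os.environ["DATABRICKS_ORG_ID"] = "<org-id>"
os.environ["DATABRICKS_PORT"] = "15001"

# With these set before the session is built, databricks-connect should
# read them instead of looking for ~/.databricks-connect:
# from pyspark.sql import SparkSession
# spark = SparkSession.builder.appName("NewSpark").getOrCreate()
```

The key point is that the variables have to be exported before getOrCreate() is called, since the config file lookup happens at session creation.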
The other way was running code like SparkSession.builder.appName('NewSpark').getOrCreate(), and then setting the connection credentials on the Spark conf, i.e.:
spark.conf.set("spark.databricks.service.token", "<token>")
spark.conf.set("spark.databricks.service.address", "<address>"), etc.
But the approach above gives me this error: Caused by: java.lang.RuntimeException: Config file /home/ec2user/.databricks-connect not found. Please run `databricks-connect configure` to accept the end user license agreement and configure Databricks Connect.
Is there a way to run Spark code without the .databricks-connect config file ever being created or populated, supplying everything through Spark conf (or similar) in code instead?