Using Spark JARs with databricks-connect>=13.0
12-05-2023 04:25 AM
With the newest version of databricks-connect, I cannot configure the extra jars I want to use. In the older version, I did that via
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName('DataFrame')
    .config('spark.jars.packages', 'org.apache.spark:spark-avro_2.12:3.3.0')
    .getOrCreate()
)
How can I configure this with databricks-connect>=13.0?
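For context, the most I can get working with the new client is the plain session below, a minimal sketch assuming authentication is already set up (e.g. through a configured profile or the DATABRICKS_* environment variables); the DatabricksSession builder does not seem to expose a config() hook for spark.jars.packages:

from databricks.connect import DatabricksSession

# Assumes host, token, and cluster id are picked up from a configured
# profile or from DATABRICKS_HOST / DATABRICKS_TOKEN / DATABRICKS_CLUSTER_ID.
spark = DatabricksSession.builder.getOrCreate()

# The builder does not appear to accept spark.jars.packages, so the
# Avro package presumably has to be installed on the cluster instead.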
1 REPLY
07-18-2024 12:21 AM
Hello, I have a question on this. I have a JAR file that is installed on my Databricks cluster. Do you happen to know how I can access the Spark session running in Databricks from within my code (which is later built into that JAR)?
I have tried using SparkSession.active and SparkSession.getActiveSession, but I get an error saying these methods are not supported.

