Using Spark JARs with databricks-connect>=13.0
With the newest version of databricks-connect, I cannot configure the extra JARs I want to use. In the older version, I did that via `spark = SparkSession.builder.appName('DataFrame').config('spark.jars.packages', 'org.apache.spark:spark-avro_...`
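For context, the pre-13.0 pattern referred to above looked roughly like the following sketch. The original snippet is truncated, so the exact Avro package coordinate here is an illustrative assumption, not taken from the thread:

```python
from pyspark.sql import SparkSession

# Pre-13.0 databricks-connect: extra Maven packages could be requested on the
# builder before the session was created, and the driver resolved and loaded
# them at startup. The coordinate below is an example; substitute the
# artifact and version you actually need.
spark = (
    SparkSession.builder
    .appName("DataFrame")
    .config("spark.jars.packages", "org.apache.spark:spark-avro_2.12:3.4.1")
    .getOrCreate()
)
```

This worked because the older client created the Spark session itself, so session-creation configs like `spark.jars.packages` were still honored.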
- 1016 Views
- 2 replies
- 0 kudos
Hi @Lazloo, in newer versions of Databricks Connect, configuring additional JARs for your Spark session is still possible. Let's adapt your previous approach to the latest version. Adding JARs to a Databricks cluster: if you want to add JAR f...
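The reply above is truncated, but the general shape of the new approach can be sketched. Databricks Connect >= 13.0 is built on Spark Connect, so the client attaches to an existing cluster rather than creating the session itself; cluster-level settings such as `spark.jars.packages` therefore cannot be set from the client builder. The sketch below shows two commonly documented alternatives; the file path is a placeholder, and `addArtifact` availability depends on your PySpark/Databricks Connect version:

```python
from databricks.connect import DatabricksSession

# Spark Connect model: the session attaches to a running cluster using your
# configured authentication profile, so there is no client-side place to set
# spark.jars.packages at session creation.
spark = DatabricksSession.builder.getOrCreate()

# Option 1 (outside this script): install the JAR as a cluster library via
# the Libraries UI or the Libraries API, so it is on the cluster classpath
# before the session attaches.

# Option 2 (Spark Connect, PySpark 3.5+): upload a local JAR into the
# current session. The path below is a placeholder, not from the thread.
spark.addArtifact("/local/path/to/spark-avro_2.12-3.4.1.jar")
```

Installing the JAR as a cluster library is the more robust choice when every notebook or job on the cluster needs it; `addArtifact` scopes the JAR to the current Spark Connect session only.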