Databricks Connect error
01-07-2025 04:52 AM
Hello,
I'm new to Databricks and Scala. I created a Scala application on my local machine and tried to connect to my cluster in a Databricks workspace using Databricks Connect, as per the documentation. My cluster uses Databricks Runtime 16.0 (includes Apache Spark 3.5.0, Scala 2.12).
I have added the below dependencies in my build.sbt:
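(The dependency list itself did not make it into this post. Based on the versions mentioned later in the thread, it was presumably something along these lines; the coordinates and versions here are a reconstruction, not the exact file:)

scalaVersion := "2.12.18"

libraryDependencies ++= Seq(
  // Open-source Spark, referenced later in the thread
  "org.apache.spark" %% "spark-core" % "3.5.0",
  // Databricks Connect client matching the cluster's runtime version
  "com.databricks" % "databricks-connect" % "16.0.0"
)

When I run the application I get the following exception: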
Exception in thread "sbt-bg-threads-7" java.lang.NoSuchMethodError: 'org.apache.spark.sql.SparkSession$Builder org.apache.spark.sql.SparkSession$Builder.client(org.apache.spark.sql.connect.client.SparkConnectClient)'
at com.databricks.connect.DatabricksSession$Builder.fromSparkClientConf(DatabricksSession.scala:522)
at com.databricks.connect.DatabricksSession$Builder.fromSdkConfig(DatabricksSession.scala:515)
at com.databricks.connect.DatabricksSession$Builder.getOrCreate(DatabricksSession.scala:446)
at leakagetest.Main$.createSparkSession(Main.scala:41)
at leakagetest.Main$.delayedEndpoint$leakagetest$Main$1(Main.scala:27)
at leakagetest.Main$delayedInit$body.apply(Main.scala:18)
at scala.Function0.apply$mcV$sp(Function0.scala:39)
at scala.Function0.apply$mcV$sp$(Function0.scala:39)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:17)
at scala.App.$anonfun$main$1$adapted(App.scala:80)
at scala.collection.immutable.List.foreach(List.scala:431)
at scala.App.main(App.scala:80)
at scala.App.main$(App.scala:78)
at leakagetest.Main$.main(Main.scala:18)
at leakagetest.Main.main(Main.scala)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:568)
at sbt.Run.invokeMain(Run.scala:144)
at sbt.Run.execute$1(Run.scala:94)
at sbt.Run.$anonfun$runWithLoader$5(Run.scala:121)
at sbt.Run$.executeSuccess(Run.scala:187)
at sbt.Run.runWithLoader(Run.scala:121)
at sbt.Defaults$.$anonfun$bgRunTask$6(Defaults.scala:1988)
at sbt.Defaults$.$anonfun$termWrapper$2(Defaults.scala:1927)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at scala.util.Try$.apply(Try.scala:213)
at sbt.internal.BackgroundThreadPool$BackgroundRunnable.run(DefaultBackgroundJobService.scala:367)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
at java.base/java.lang.Thread.run(Thread.java:840)
Could someone please help me solve this error?
01-07-2025 05:36 AM - edited 01-07-2025 05:36 AM
The error you are encountering, java.lang.NoSuchMethodError: 'org.apache.spark.sql.SparkSession$Builder org.apache.spark.sql.SparkSession$Builder.client(org.apache.spark.sql.connect.client.SparkConnectClient)', suggests a mismatch between the versions of the libraries you are using.
Check library versions: ensure that the versions of spark-core and databricks-connect you are using are compatible with each other. The databricks-connect package version must match the Databricks Runtime version of your cluster. For example, if you are using Databricks Runtime 12.2 LTS, you should use databricks-connect 12.2.*.
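A common cause of this specific NoSuchMethodError is having the open-source spark-core/spark-sql artifacts on the classpath alongside databricks-connect: both provide a class named org.apache.spark.sql.SparkSession, and if the stock one wins, its Builder has no client(...) method. As a quick check (assuming sbt 1.4+ with addDependencyTreePlugin enabled in project/plugins.sbt), you can list which Spark artifacts actually land on the classpath:

sbt dependencyTree

If org.apache.spark artifacts show up there in addition to databricks-connect, try removing them from build.sbt; the Databricks Connect for Scala tutorial uses databricks-connect as its only Spark dependency.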
01-07-2025 05:49 AM
I am using Databricks Runtime 16.0, so I set databricks-connect to 16.0.0, and I changed spark-core to 3.5.0 as per the documentation. I am still getting the same error.
01-07-2025 06:10 AM
If you try with a lower DBR, does it work, or do you get the exact same issue?
01-13-2025 06:37 AM
Got the same error with a lower DBR.
I am using OAuth user-to-machine (U2M) authentication and a configuration profile named DEFAULT, which contains host, cluster_id, and auth_type, saved in the .databrickscfg file. I logged in using the Databricks CLI before running the code.
To create the SparkSession I used:
val spark = DatabricksSession.builder().remote().getOrCreate()
On debugging I could see that cluster_id, host, sdkConfig, and token are all None.
I followed this tutorial: https://docs.databricks.com/en/dev-tools/databricks-connect/scala/index.html#tutorial
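(For reference, one way to check what the CLI has actually resolved for the profile, assuming a recent Databricks CLI with the auth subcommands, is:

databricks auth env --profile DEFAULT

which should print the host and credentials it would hand to clients; if it cannot, the profile itself is the problem.)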
01-13-2025 06:51 AM
So the issue is happening at Step 4. Is this the only workspace you have synced in the CLI, or have you worked with any other workspace as well?
01-13-2025 11:46 AM
This is the only workspace that I have synced.
01-13-2025 11:57 AM
Can you try creating another profile instead of the DEFAULT one and test with it? It seems that what is not being picked up is the cluster details, but I wanted to check with a new profile. The profile entry should look something like the sketch below.
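(A sketch of the profile entry in ~/.databrickscfg; the profile name and values are placeholders, and the auth_type value assumes OAuth via the CLI as described above:)

[my-new-profile]
host = https://<your-workspace-url>
cluster_id = <your-cluster-id>
auth_type = databricks-cli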
01-13-2025 11:41 PM
I tried creating another profile and used the below code:
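(The code block itself did not make it into this post. Given the fromSdkConfig frame in the stack trace above, it was presumably along these lines; the profile name is a placeholder:)

import com.databricks.connect.DatabricksSession
import com.databricks.sdk.core.DatabricksConfig

// Load the named profile from .databrickscfg and pass it to the session builder
val config = new DatabricksConfig().setProfile("my-new-profile")
val spark = DatabricksSession.builder().sdkConfig(config).getOrCreate()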
01-14-2025 01:50 AM - edited 01-14-2025 01:59 AM
Try this with explicit parameters once, and run the following command to configure Databricks Connect to use the .databrickscfg file from the CLI:
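(Neither the parameter snippet nor the command survived in this post; what follows is a hedged reconstruction. The builder setters match the Databricks Connect Scala docs, and all values are placeholders:)

import com.databricks.connect.DatabricksSession

// Bypass the profile and pass the connection details explicitly
val spark = DatabricksSession.builder()
  .host("https://<your-workspace-url>")
  .token("<your-personal-access-token>")
  .clusterId("<your-cluster-id>")
  .getOrCreate()

And, assuming a recent Databricks CLI, the command to log in and write the cluster id into a profile (the --configure-cluster flag prompts you to pick a cluster and saves it):

databricks auth login --configure-cluster --host https://<your-workspace-url> --profile my-new-profile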