Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Not able to connect to Salesforce; we need to read data from Salesforce

709986
New Contributor

We are not able to connect to Salesforce. We need to read data from Salesforce, but we are getting:

NoClassDefFoundError: scala/Product$class

Code:

%scala

val sfDF = spark.read
  .format("com.springml.spark.salesforce")
  .option("username", "sfdfusername")
  .option("password", "sfdfpassword")
  .option("soql", "select * from table")
  .option("version", "37.0")
  .load()

I also added the libraries below to the cluster, but I'm not sure why we are still facing the issue:

com.springml:spark-salesforce_2.11:1.1.3

org.scala-lang:scala-library:2.13.10

1 REPLY

Anonymous
Not applicable

@Amar.Kasar:

The error you are getting,

NoClassDefFoundError: scala/Product$class

suggests a Scala version mismatch on the classpath: scala/Product$class exists in Scala 2.11 but was removed in Scala 2.12 (the trait encoding changed), so a library compiled for Scala 2.11 is being loaded by a newer Scala runtime. You can try the following steps to troubleshoot the issue:

  1. Check that the library com.springml:spark-salesforce_2.11:1.1.3 is correctly installed on your cluster. In the Clusters UI, select the cluster you are using, open the "Libraries" tab, and confirm that the library is present with status "Installed".
  2. Check which version of Scala your cluster runs. The library com.springml:spark-salesforce_2.11:1.1.3 is compiled for Scala 2.11, so your cluster's Databricks Runtime must also use Scala 2.11 (Databricks Runtime 7.0 and later ship with Scala 2.12). Also avoid attaching org.scala-lang:scala-library yourself: the runtime already provides it, and the 2.13.10 copy you attached will conflict with a 2.11 connector. You can check the cluster's Scala version by running the following command in a notebook cell:
%scala
println(scala.util.Properties.versionString)

This will print the version of Scala that is being used by the cluster.
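As a quick sanity check, you can also compare the runtime's Scala binary version against the _2.11 suffix in the connector coordinate. This is a minimal sketch I'm adding here, not part of the connector's API; the connectorBinary value is hard-coded from the coordinate above:

%scala
// Minimal sketch: the connector's "_2.11" suffix must match the cluster's Scala binary version.
// versionNumberString returns e.g. "2.12.15"; keep only the first two segments.
val runtimeBinary = scala.util.Properties.versionNumberString.split("\\.").take(2).mkString(".")
val connectorBinary = "2.11" // from com.springml:spark-salesforce_2.11:1.1.3
require(runtimeBinary == connectorBinary,
  s"Connector is built for Scala $connectorBinary but the cluster runs Scala $runtimeBinary")

If the check fails, either pick a Databricks Runtime that ships the matching Scala version or use a connector build published for your runtime's Scala version, if one exists.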

  3. Check if there are any conflicting versions of Scala libraries. Sometimes, different libraries may require different versions of Scala, which can cause conflicts. You can check for conflicts by running the following command in a notebook cell:

%scala
println(this.getClass().getClassLoader().getResource("reference.conf"))

This prints the classpath location of reference.conf, the default configuration file bundled with libraries that use Typesafe Config. The jar path in the output tells you which jar the file is being loaded from, which can help you spot duplicate or conflicting jars on the classpath.
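Another diagnostic you can try (a sketch using only standard JVM APIs, not something from the original steps) is to ask the JVM which jar actually provides the Scala standard library at runtime. If the printed path is a scala-library jar whose version does not match the connector's _2.11 suffix, you have found the conflict:

%scala
// Sketch: locate the jar that provides scala.Product at runtime.
// getCodeSource can be null when the class comes from the bootstrap classpath.
val source = classOf[scala.Product].getProtectionDomain.getCodeSource
println(if (source != null) source.getLocation else "scala.Product loaded from the bootstrap classpath")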
