
Not able to connect to Salesforce; we need to read data from Salesforce

709986
New Contributor

We are not able to connect to Salesforce (we need to read data from Salesforce) and are getting:

NoClassDefFoundError: scala/Product$class

Code:

%scala

val sfDF = spark.
        read.
        format("com.springml.spark.salesforce").
        option("username", "sfdfusername").
        option("password", "sfdfpassword").
        option("soql", "select * from table").
        option("version", "37.0").
        load()

We also added the libraries below to the cluster, but we are not sure why we are still facing the issue:

com.springml:spark-salesforce_2.11:1.1.3

org.scala-lang:scala-library:2.13.10

1 REPLY

Anonymous
Not applicable

@Amar.Kasar:

The error you are getting, NoClassDefFoundError: scala/Product$class, points to a Scala binary-version mismatch on the classpath: Trait$class implementation classes such as scala/Product$class exist only in Scala 2.11 (they were removed in Scala 2.12), so a library compiled for 2.11 fails in exactly this way when it runs against a newer Scala library. You can try the following steps to troubleshoot the issue:

  1. Check if the library com.springml:spark-salesforce_2.11:1.1.3 is correctly installed on your cluster. You can do this by going to the Clusters UI, selecting the cluster that you are using, and then clicking on "Libraries" on the left-hand menu. Check if the library is present and its status is "Installed".
  2. Check if the correct version of Scala is installed on your cluster. The library com.springml:spark-salesforce_2.11:1.1.3 is compiled for Scala 2.11, so the cluster's runtime must also be running Scala 2.11. You can check this by running the following command in a notebook cell:
%scala
println(scala.util.Properties.versionString)

This prints the Scala version used by the cluster; its first two components must match the _2.11 suffix of the connector artifact (see the version-matching sketch after this list).

  3. Check if there are any conflicting versions of Scala libraries. Sometimes, different libraries require different versions of Scala, which can cause conflicts. One quick probe is to ask the classloader where a shared resource is coming from, as shown below and in the classloader sketch after this list:

%scala
println(this.getClass().getClassLoader().getResource("reference.conf"))

This prints the classpath location (usually a JAR) from which the reference.conf configuration file is being loaded. If the URL points at an unexpected JAR, two libraries are shadowing each other on the classpath.
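Building on step 2: the Scala binary version printed above must match the suffix of the connector artifact. A minimal sketch of that comparison (the artifact name printed at the end is illustrative; check Maven Central for which suffixes the connector is actually published under):

%scala
// Sketch: derive the cluster's Scala binary version and the matching artifact suffix.
// Only the two scala.util.Properties calls are standard library facts; the rest is illustration.
val runtimeVersion = scala.util.Properties.versionNumberString       // e.g. "2.12.15"
val binaryVersion  = runtimeVersion.split('.').take(2).mkString(".") // e.g. "2.12"
println(s"Cluster Scala binary version: $binaryVersion")
println(s"Connector artifact should end in _$binaryVersion, e.g. spark-salesforce_$binaryVersion")

Note that your cluster also has org.scala-lang:scala-library:2.13.10 installed; adding a second Scala standard library on top of the runtime's own is a likely source of exactly this NoClassDefFoundError, so removing that library is worth trying first.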
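And for step 3, you can ask the JVM directly which JAR supplies the class that fails to load. A minimal diagnostic sketch (getCodeSource may return null for bootstrap-loaded classes, hence the Option wrapper):

%scala
// Sketch: locate the JAR that provides scala.Product on the cluster's classpath.
// If it is the manually installed scala-library 2.13 JAR rather than the runtime's
// own Scala library, the NoClassDefFoundError is explained.
val source = Option(classOf[scala.Product].getProtectionDomain.getCodeSource)
println(source.map(_.getLocation.toString).getOrElse("loaded by the bootstrap classloader"))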
