Spark SQL Connector

Rahul_Samant
Contributor

I am trying to read data from an Azure SQL Database from Databricks. The Azure SQL Database is created with a private link endpoint. I am using a DBR 10.4 LTS cluster, and my expectation is that the connector is pre-installed, as per the documentation.

I am using the code below to fetch the data, but I am getting java.lang.ClassNotFoundException:

df = spark.read \
    .format("com.microsoft.sqlserver.jdbc.spark") \
    .option("url", "xyz-server.database.windows.net") \
    .option("dbtable", "test") \
    .option("databaseName", "test") \
    .option("user", "test") \
    .option("password", "abc") \
    .load()

Any help is appreciated.


4 REPLIES

artsheiko
Valued Contributor II

It seems that .option("databaseName", "test") is redundant here, as you need to include the database name in the URL.

Please verify that you are using a connector compatible with your cluster's Spark version: Apache Spark connector: SQL Server & Azure SQL
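
For illustration, a minimal sketch of what the read could look like with the database name folded into the JDBC URL instead of a separate databaseName option (the server, database, user, and password below are the placeholders from the original post, and the connector library is assumed to be attached to the cluster):

# Sketch only: read from Azure SQL with the Spark connector, assuming
# com.microsoft.azure:spark-mssql-connector_2.12 is installed on the cluster.
# Server name, database name, and credentials are placeholders.
jdbc_url = "jdbc:sqlserver://xyz-server.database.windows.net:1433;database=test"

df = spark.read \
    .format("com.microsoft.sqlserver.jdbc.spark") \
    .option("url", jdbc_url) \
    .option("dbtable", "test") \
    .option("user", "test") \
    .option("password", "abc") \
    .load()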

Rahul_Samant
Contributor

Hi @Artem Sheiko,

I read that it's pre-installed in the Databricks Runtime and that we don't have to worry about it, is that right?

By the way, as per the link you shared, the connector is only available for Spark 3.1.x, and DBR 10.4 LTS is on Spark 3.2.x, so is that why it's not working?

Connector | Maven Coordinate
Spark 2.4.x compatible connector | com.microsoft.azure:spark-mssql-connector:1.0.2
Spark 3.0.x compatible connector | com.microsoft.azure:spark-mssql-connector_2.12:1.1.0
Spark 3.1.x compatible connector | com.microsoft.azure:spark-mssql-connector_2.12:1.2.0
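
As a quick sanity check (not from the original thread), the Spark version a cluster is actually running can be read from a notebook; DBR 10.4 LTS reports a 3.2.x version:

# Sketch only: confirm the cluster's Spark version from a notebook.
# On DBR 10.4 LTS this prints a 3.2.x version string.
print(spark.version)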

artsheiko
Valued Contributor II

Hi,

My point was just to verify that you are using a version compatible with your DBR.

No, it's not pre-installed; you need to install it using its Maven coordinate. You can see here how to proceed with installing it.

The list of pre-installed libraries can be found in the DBR 10.4 LTS release notes.
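
For illustration, a hedged sketch of attaching the connector to a cluster by its Maven coordinate through the Databricks Libraries REST API (the workspace URL, token, and cluster ID below are placeholders; installing the same coordinate through the cluster's Libraries UI is equivalent):

import requests

# Sketch only: install the connector on a cluster via the Databricks
# Libraries API. Workspace URL, token, and cluster ID are placeholders.
workspace_url = "https://<your-workspace>.azuredatabricks.net"
token = "<personal-access-token>"

resp = requests.post(
    f"{workspace_url}/api/2.0/libraries/install",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "cluster_id": "<cluster-id>",
        "libraries": [
            {"maven": {"coordinates": "com.microsoft.azure:spark-mssql-connector_2.12:1.2.0"}}
        ],
    },
)
resp.raise_for_status()  # the library becomes available once installation finishes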

Rahul_Samant
Contributor

I downgraded to DBR 9.1 LTS and installed the connector below; it's working now.

Spark 3.1.x compatible connector

com.microsoft.azure:spark-mssql-connector_2.12:1.2.0
