
Databricks notebook issue

Hani4hanuman
New Contributor II

Hi,

I'm trying to run an ADF pipeline. However, it is failing at the Notebook activity with the error below.

Error: NoSuchMethodError: com.microsoft.sqlserver.jdbc.SQLServerBulkCopy.writeToServer(Lcom/microsoft/sqlserver/jdbc/ISQLServerBulkRecord;)V

I think it's a library issue, but I'm not sure which one needs to be installed to resolve it.

Details: My Azure Databricks cluster runtime is 9.1 LTS with Scala 2.12.

Can you please help with this? I tried installing different libraries, such as the one below, but it didn't work:

com.microsoft.azure:spark-mssql-connector_2.12:1.2.0
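
(For context, a rough sketch of the kind of write that goes through SQLServerBulkCopy when this connector is used; all connection details below are placeholders, not from the original post.)

# Minimal sketch, assuming the spark-mssql-connector library is attached to the cluster
# and `spark` is the notebook's SparkSession. Server, database, table, user, and password
# are hypothetical placeholders.
df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "value"])

(df.write
    .format("com.microsoft.sqlserver.jdbc.spark")  # data source registered by the connector
    .mode("append")
    .option("url", "jdbc:sqlserver://<server>.database.windows.net:1433;databaseName=<db>")
    .option("dbtable", "dbo.<table>")
    .option("user", "<user>")
    .option("password", "<password>")
    .save())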

2 REPLIES

shan_chandra
Esteemed Contributor

@Hani4hanuman - Kindly try the latest connector, com.microsoft.azure:spark-mssql-connector_2.12:1.3.0, and use DBR 12.2 LTS or above.

If the issue still persists, kindly raise a support case with us. 
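
(As a rough illustration, one way to attach that Maven coordinate to an existing cluster programmatically is the Databricks Libraries API; installing it through the cluster's Libraries UI works just as well. The workspace URL, token, and cluster ID below are placeholders.)

import requests

# Hedged sketch: install the suggested connector on a running cluster via the
# Libraries API. Replace the placeholders with real values for your workspace.
workspace_url = "https://<your-workspace>.azuredatabricks.net"
token = "<personal-access-token>"
cluster_id = "<cluster-id>"

resp = requests.post(
    f"{workspace_url}/api/2.0/libraries/install",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "cluster_id": cluster_id,
        "libraries": [
            {"maven": {"coordinates": "com.microsoft.azure:spark-mssql-connector_2.12:1.3.0"}}
        ],
    },
)
resp.raise_for_status()  # installation is asynchronous; check the cluster's Libraries tab for status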

Hani4hanuman
New Contributor II

@shan_chandra Thanks for your reply. As per your suggestion, I changed the Databricks runtime version from 9.1 LTS to 12.2 LTS.

(screenshot attached: Hani4hanuman_0-1691381108158.png)

But after making this change, when I checked for the library you provided (i.e. com.microsoft.azure:spark-mssql-connector_2.12:1.3.0) under Maven, it is not available in the list; only a Beta version is available. I tried that as well, but it is still failing.

(screenshot attached: Hani4hanuman_1-1691381231966.png)
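
(In case it helps with debugging, a small diagnostic sketch, assuming a Python cell on the same cluster: the NoSuchMethodError typically points to a mismatch between the connector and the mssql-jdbc driver bundled with the runtime, so checking which JAR actually provides SQLServerBulkCopy can narrow it down.)

# Hedged diagnostic sketch: find which JAR supplies SQLServerBulkCopy on the driver,
# using the notebook's built-in `spark` SparkSession and its Py4J JVM gateway.
bulk_copy_class = spark._jvm.java.lang.Class.forName(
    "com.microsoft.sqlserver.jdbc.SQLServerBulkCopy"
)
jar_location = bulk_copy_class.getProtectionDomain().getCodeSource().getLocation()
print(jar_location)  # e.g. the path of the mssql-jdbc JAR shipped with the Databricks runtime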

 
