How to install (MSSQL) drivers on a job compute?

PabloCSD
Valued Contributor

Hello, I'm having this issue with job computes:

[Screenshot of the error attached: PabloCSD_0-1745527653462.png]

The code snippet is as follows:

     84 if self.conf["persist_to_sql"]:
     85     # persist to sql
     86     df_parsed.write.format(
     87         "com.microsoft.sqlserver.jdbc.spark"
     88     ).option("url", self.get_sql_connection_string()).option(
     89         "dbtable", self.conf["sql_tables"]["report"]
     90     ).option(
     91         "driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver"
     92     ).mode(
     93         "append"
---> 94     ).save()
     95     self.logger.info(
     96         f"Saved to {self.conf['sql_tables']['report']}"
     97     )

 How can I install the driver on this job compute?

This is the driver I want to install: spark-mssql-connector_2.12-1.4.0-BETA.jar

I tried:

  • Uploading an init script that installs the driver to the workspace Shared folder (didn't work)
  • Specifying the Maven library in the cluster definition in the databricks.yml configuration (see the sketch below)
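
For context, the kind of databricks.yml configuration I mean is roughly the following (job, cluster, and task names are placeholders; in asset bundles, Maven libraries attach to the task rather than to the cluster definition):

# Sketch of a job in databricks.yml; all names, versions, and node types are placeholders
resources:
  jobs:
    report_job:
      job_clusters:
        - job_cluster_key: main
          new_cluster:
            spark_version: "13.3.x-scala2.12"
            node_type_id: "Standard_DS3_v2"
            num_workers: 2
      tasks:
        - task_key: persist_report
          job_cluster_key: main
          libraries:
            - maven:
                coordinates: "com.microsoft.azure:spark-mssql-connector_2.12:1.4.0-BETA"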

Thanks

5 REPLIES

PabloCSD
Valued Contributor

Up!

AIenthu
New Contributor III

To install the spark-mssql-connector_2.12-1.4.0-BETA.jar driver for Azure Databricks job clusters, use one of these methods:

1. Cluster-Level Installation via Maven

Add these Maven coordinates to your cluster/library configuration:

 
com.microsoft.azure:spark-mssql-connector_2.12:1.4.0-BETA
com.microsoft.sqlserver:mssql-jdbc:8.4.1.jre8
  • Steps:

    1. Navigate to your Databricks cluster settings.

    2. Under Libraries, select Install New > Maven.

    3. Paste the coordinates above and install.
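
Alternatively, assuming the legacy Databricks CLI is configured against your workspace, the same installation can be scripted against a running cluster (the cluster ID is a placeholder):

# Legacy Databricks CLI; replace the cluster ID with your own
databricks libraries install \
  --cluster-id 0123-456789-abcdef \
  --maven-coordinates "com.microsoft.azure:spark-mssql-connector_2.12:1.4.0-BETA"
databricks libraries install \
  --cluster-id 0123-456789-abcdef \
  --maven-coordinates "com.microsoft.sqlserver:mssql-jdbc:8.4.1.jre8"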

2. Job-Specific Configuration

For job clusters, specify libraries directly in the job settings:

 
libraries:
  - maven:
      coordinates: "com.microsoft.azure:spark-mssql-connector_2.12:1.4.0-BETA"
  - maven:
      coordinates: "com.microsoft.sqlserver:mssql-jdbc:8.4.1.jre8"

This ensures the dependencies are included when the job cluster starts.
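
For reference, the equivalent fragment in a Jobs API 2.1 task definition would look something like this (the task key is a placeholder):

{
  "task_key": "persist_report",
  "libraries": [
    { "maven": { "coordinates": "com.microsoft.azure:spark-mssql-connector_2.12:1.4.0-BETA" } },
    { "maven": { "coordinates": "com.microsoft.sqlserver:mssql-jdbc:8.4.1.jre8" } }
  ]
}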

3. Init Script (Alternative)

If using an init script, ensure both the connector and JDBC driver are copied:

 
#!/bin/bash
cp /dbfs/FileStore/jars/spark-mssql-connector_2.12-1.4.0-BETA.jar /databricks/jars/
cp /dbfs/FileStore/jars/mssql-jdbc-8.4.1.jre8.jar /databricks/jars/
  • Upload the JARs to DBFS first, then reference the script on the job cluster (see the sketch below)
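
A minimal sketch of how the script gets attached in the cluster spec, assuming it is stored as a workspace file (the path, runtime version, and node type are placeholders):

# Fragment of a job cluster definition; path and sizing are placeholders
new_cluster:
  spark_version: "13.3.x-scala2.12"
  node_type_id: "Standard_DS3_v2"
  num_workers: 1
  init_scripts:
    - workspace:
        destination: "/Shared/init/install-mssql.sh"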

Key Considerations

  • Spark Version Compatibility: Confirm your Databricks runtime aligns with the connector’s Spark 3.4 requirement

  • JDBC Dependency: The mssql-jdbc driver is mandatory to resolve NoClassDefFoundError

  • Avoid Manual DBFS Uploads: Maven installation is preferred for dependency management

For Azure Data Factory pipelines, configure the libraries in the Databricks linked service's Advanced settings under "Libraries".

NandiniN
Databricks Employee

For a job compute, you would have to go the init script route.

Can you please highlight the cause of the failure of the library installation via the init script?

PabloCSD
Valued Contributor

Hello @NandiniN ,

In the Shared folder I created an init script for the job compute, defined as shown:

 

#!/bin/bash
echo "=====> Installing MSSQL Spark Connector"
cp /dbfs/Workspace/Shared/drivers/spark-mssql-connector_2.12-1.4.0-BETA.jar /databricks/jars/
echo "=====> Connector copied!"

 

[Screenshot attached: PabloCSD_0-1746453202163.png]

There isn't anything in the logs, and the workflow is still failing due to the missing driver. If you could give me any direction for trying to solve this, it would be much appreciated.
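
For what it's worth, a variant of the script that logs whether the jar actually lands would at least make the failure visible in the init script logs (same path as above):

#!/bin/bash
echo "=====> Installing MSSQL Spark Connector"
cp /dbfs/Workspace/Shared/drivers/spark-mssql-connector_2.12-1.4.0-BETA.jar /databricks/jars/
# Log the result of the copy so any failure shows up in the init script logs
ls /databricks/jars/ | grep -i mssql || echo "=====> jar NOT found in /databricks/jars/"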

PabloCSD
Valued Contributor

Up!
