Here are the best ways to install the driver, depending on your specific environment:
1. Recommended: Use Unity Catalog Volumes
For modern Databricks runtimes (13.3 LTS and above), storing JARs in a Unity Catalog Volume is the standard for secure, governed access.
Step 1: Create a volume and upload the ngdbc.jar file to it (you can do this via Catalog Explorer).
Step 2: When defining your UC Connection, include the path to this JAR in the java_dependencies parameter.
CREATE CONNECTION hana_connection TYPE JDBC
ENVIRONMENT (
  java_dependencies '["/Volumes/your_catalog/your_schema/your_volume/ngdbc.jar"]'
)
OPTIONS (
  url 'jdbc:sap://<hana_host>:<port>',
  user '<user>',
  password '<password>'
);
Step 3: Ensure the principal that runs your cluster or job has the READ VOLUME privilege on that specific volume.
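If you prefer to script Steps 1 and 3 instead of using the UI, a minimal notebook sketch could look like the following (the catalog, schema, volume, staging path, and principal names are placeholders to adjust for your environment; dbutils and spark are the objects predefined in a Databricks notebook):

# Copy a locally staged driver JAR into the Volume (alternative to Catalog Explorer).
# Assumes ngdbc.jar was first staged on the driver node at /tmp/ngdbc.jar.
dbutils.fs.cp(
    "file:/tmp/ngdbc.jar",
    "/Volumes/your_catalog/your_schema/your_volume/ngdbc.jar",
)

# Grant the READ VOLUME privilege to the principal that runs the cluster or job (Step 3).
spark.sql(
    "GRANT READ VOLUME ON VOLUME your_catalog.your_schema.your_volume "
    "TO `etl_service_principal`"
)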
2. Install as a Cluster-Scoped Library
If you are running this from a standard Job or All-Purpose cluster, you can install the driver directly on the compute resource.
- Via Maven: This is the easiest way, as it handles dependencies automatically.
- Go to your cluster's Libraries tab > Install New > Maven.
- Enter the coordinates: com.sap.cloud.db.jdbc:ngdbc:2.25.9 (or your preferred version).
- Via JAR Upload:
- Go to Libraries > Install New > Upload > JAR.
- Upload your local ngdbc.jar.
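With the driver attached to the cluster, reading from HANA is a standard JDBC read. Here is a minimal PySpark sketch, assuming placeholder host, port, table, and secret-scope names (the driver class shipped in ngdbc is com.sap.db.jdbc.Driver):

# Minimal JDBC read sketch -- <hana_host>, <port>, the table name, and the
# "hana_scope" secret scope are placeholders; adjust them to your environment.
df = (
    spark.read.format("jdbc")
    .option("url", "jdbc:sap://<hana_host>:<port>")
    .option("driver", "com.sap.db.jdbc.Driver")
    .option("dbtable", "MY_SCHEMA.MY_TABLE")
    .option("user", dbutils.secrets.get("hana_scope", "user"))
    .option("password", dbutils.secrets.get("hana_scope", "password"))
    .load()
)
display(df.limit(10))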
3. Using Databricks Asset Bundles (DABs)
If you are deploying your ETL via bundles, you must explicitly list the library in your databricks.yml task definition.
resources:
  jobs:
    hana_etl_job:
      tasks:
        - task_key: extraction_task
          libraries:
            - maven:
                coordinates: "com.sap.cloud.db.jdbc:ngdbc:2.25.9"
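After editing databricks.yml, redeploy with databricks bundle deploy so the job picks up the new dependency; inside the task itself, the read can follow the same JDBC pattern sketched in option 2.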
Hope this helps!
Thanks!
Om Singh