<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>How to install SAP JDBC on job cluster via asset bundles in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/how-to-install-sap-jdbc-on-job-cluster-via-asset-bundles/m-p/127423#M47961</link>
    <description>&lt;P&gt;I'm trying to use the SAP JDBC driver to read data in my Spark application, which I deploy via asset bundles on job compute.&lt;/P&gt;&lt;P&gt;I was able to install the SAP JDBC driver on a &lt;FONT color="#008000"&gt;&lt;STRONG&gt;general purpose&lt;/STRONG&gt;&amp;nbsp;&lt;STRONG&gt;cluster&lt;/STRONG&gt;&lt;/FONT&gt; by adding the jar (com.sap.cloud.db.jdbc:ngdbc:2.25.9) in the UI via Libraries -&amp;gt; Install new -&amp;gt; Maven -&amp;gt; Maven coordinates.&lt;BR /&gt;Then I could execute the code without problems.&lt;/P&gt;&lt;P&gt;But when I try to add the jar on a &lt;FONT color="#FF0000"&gt;&lt;STRONG&gt;job cluster&lt;/STRONG&gt;&lt;/FONT&gt; in Databricks asset bundles via the Spark config, like this:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;spark_conf:
      "spark.databricks.cluster.profile": "singleNode"
      "spark.master": "local[*]"
      "spark.sql.session.timeZone": "UTC"
      "spark.databricks.dataLineage.enabled": "true"
      "spark.databricks.delta.retentionDurationCheck.enabled": "false"
      "spark.sql.extensions": "io.delta.sql.DeltaSparkSessionExtension"
      "spark.sql.catalog.spark_catalog": "org.apache.spark.sql.delta.catalog.DeltaCatalog"
      "spark.jars.packages": "com.microsoft.azure:azure-storage:8.6.3,org.apache.hadoop:hadoop-azure:3.3.1,io.delta:delta-core_2.12:2.4.0,com.sap.cloud.db.jdbc:ngdbc:2.25.9"&lt;/LI-CODE&gt;&lt;P&gt;I cannot get it to work: my application fails with the error '&lt;STRONG&gt;java.lang.ClassNotFoundException: com.sap.db.jdbc.Driver&lt;/STRONG&gt;', as if the jar is not installed correctly. The other jars are available, however.&lt;/P&gt;&lt;P&gt;In the Log4j output of the cluster, I can see the Spark config being set correctly:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;spark.home=/databricks/spark
spark.jars.packages=com.microsoft.azure:azure-storage:8.6.3,org.apache.hadoop:hadoop-azure:3.3.1,io.delta:delta-core_2.12:2.4.0,com.sap.cloud.db.jdbc:ngdbc:2.25.9&lt;/LI-CODE&gt;&lt;P&gt;I do not see the SAP jar being referenced anywhere else in the log output.&lt;/P&gt;&lt;P&gt;What am I doing wrong, and how can I fix it or further debug the problem?&lt;/P&gt;</description>
    <pubDate>Tue, 05 Aug 2025 08:07:36 GMT</pubDate>
    <dc:creator>VicS</dc:creator>
    <dc:date>2025-08-05T08:07:36Z</dc:date>
    <item>
      <title>How to install SAP JDBC on job cluster via asset bundles</title>
      <link>https://community.databricks.com/t5/data-engineering/how-to-install-sap-jdbc-on-job-cluster-via-asset-bundles/m-p/127423#M47961</link>
      <description>&lt;P&gt;I'm trying to use the SAP JDBC driver to read data in my Spark application, which I deploy via asset bundles on job compute.&lt;/P&gt;&lt;P&gt;I was able to install the SAP JDBC driver on a &lt;FONT color="#008000"&gt;&lt;STRONG&gt;general purpose&lt;/STRONG&gt;&amp;nbsp;&lt;STRONG&gt;cluster&lt;/STRONG&gt;&lt;/FONT&gt; by adding the jar (com.sap.cloud.db.jdbc:ngdbc:2.25.9) in the UI via Libraries -&amp;gt; Install new -&amp;gt; Maven -&amp;gt; Maven coordinates.&lt;BR /&gt;Then I could execute the code without problems.&lt;/P&gt;&lt;P&gt;But when I try to add the jar on a &lt;FONT color="#FF0000"&gt;&lt;STRONG&gt;job cluster&lt;/STRONG&gt;&lt;/FONT&gt; in Databricks asset bundles via the Spark config, like this:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;spark_conf:
      "spark.databricks.cluster.profile": "singleNode"
      "spark.master": "local[*]"
      "spark.sql.session.timeZone": "UTC"
      "spark.databricks.dataLineage.enabled": "true"
      "spark.databricks.delta.retentionDurationCheck.enabled": "false"
      "spark.sql.extensions": "io.delta.sql.DeltaSparkSessionExtension"
      "spark.sql.catalog.spark_catalog": "org.apache.spark.sql.delta.catalog.DeltaCatalog"
      "spark.jars.packages": "com.microsoft.azure:azure-storage:8.6.3,org.apache.hadoop:hadoop-azure:3.3.1,io.delta:delta-core_2.12:2.4.0,com.sap.cloud.db.jdbc:ngdbc:2.25.9"&lt;/LI-CODE&gt;&lt;P&gt;I cannot get it to work: my application fails with the error '&lt;STRONG&gt;java.lang.ClassNotFoundException: com.sap.db.jdbc.Driver&lt;/STRONG&gt;', as if the jar is not installed correctly. The other jars are available, however.&lt;/P&gt;&lt;P&gt;In the Log4j output of the cluster, I can see the Spark config being set correctly:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;spark.home=/databricks/spark
spark.jars.packages=com.microsoft.azure:azure-storage:8.6.3,org.apache.hadoop:hadoop-azure:3.3.1,io.delta:delta-core_2.12:2.4.0,com.sap.cloud.db.jdbc:ngdbc:2.25.9&lt;/LI-CODE&gt;&lt;P&gt;I do not see the SAP jar being referenced anywhere else in the log output.&lt;/P&gt;&lt;P&gt;What am I doing wrong, and how can I fix it or further debug the problem?&lt;/P&gt;</description>
      <pubDate>Tue, 05 Aug 2025 08:07:36 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/how-to-install-sap-jdbc-on-job-cluster-via-asset-bundles/m-p/127423#M47961</guid>
      <dc:creator>VicS</dc:creator>
      <dc:date>2025-08-05T08:07:36Z</dc:date>
    </item>
    <item>
      <title>Re: How to install SAP JDBC on job cluster via asset bundles</title>
      <link>https://community.databricks.com/t5/data-engineering/how-to-install-sap-jdbc-on-job-cluster-via-asset-bundles/m-p/127425#M47963</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/132397"&gt;@VicS&lt;/a&gt;,&lt;/P&gt;&lt;P&gt;To add Maven packages to a job task definition, specify a&amp;nbsp;maven&amp;nbsp;mapping in&amp;nbsp;libraries&amp;nbsp;for each Maven package to be installed. For each mapping, specify the following:&lt;/P&gt;&lt;LI-CODE lang="yaml"&gt;resources:
  jobs:
    my_job:
      # ...
      tasks:
        - task_key: my_task
          # ...
          libraries:
            - maven:
                coordinates: com.databricks:databricks-sdk-java:0.8.1
            - maven:
                coordinates: com.databricks:databricks-dbutils-scala_2.13:0.1.4
                repo: https://mvnrepository.com/
                exclusions:
                  - org.scala-lang:scala-library:2.13.0-RC*&lt;/LI-CODE&gt;&lt;P&gt;For more details, refer to the following documentation entry:&lt;/P&gt;&lt;P&gt;&lt;A href="https://docs.databricks.com/aws/en/dev-tools/bundles/library-dependencies#maven-package" target="_blank" rel="noopener"&gt;Databricks Asset Bundles library dependencies | Databricks Documentation&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 05 Aug 2025 08:15:53 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/how-to-install-sap-jdbc-on-job-cluster-via-asset-bundles/m-p/127425#M47963</guid>
      <dc:creator>szymon_dybczak</dc:creator>
      <dc:date>2025-08-05T08:15:53Z</dc:date>
    </item>
  </channel>
</rss>

