<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Databricks Apps with Pyodbc Microsoft SQL Driver in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/databricks-apps-with-pyodbc-microsoft-sql-driver/m-p/125206#M47368</link>
    <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/28848"&gt;@bcodernet&lt;/a&gt;&amp;nbsp;,&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;Each Databricks app can include dependencies for Python, Node.js, or both. You define these dependencies in language-specific files:&lt;/SPAN&gt;&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;Use a&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;requirements.txt&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;file to specify additional&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;STRONG&gt;Python&lt;/STRONG&gt;&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;packages.&lt;/LI&gt;&lt;LI&gt;Use a&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;package.json&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;file to specify&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;STRONG&gt;Node.js&lt;/STRONG&gt;&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;packages.&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;&lt;A href="https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/dependencies" target="_blank" rel="noopener"&gt;Manage dependencies for a Databricks app - Azure Databricks | Microsoft Learn&lt;/A&gt;&lt;/P&gt;</description>
    <pubDate>Mon, 14 Jul 2025 20:21:51 GMT</pubDate>
    <dc:creator>szymon_dybczak</dc:creator>
    <dc:date>2025-07-14T20:21:51Z</dc:date>
    <item>
      <title>Databricks Apps with Pyodbc Microsoft SQL Driver</title>
      <link>https://community.databricks.com/t5/data-engineering/databricks-apps-with-pyodbc-microsoft-sql-driver/m-p/125203#M47367</link>
      <description>&lt;P&gt;I'm building an app that interfaces with an Azure SQL Database. I need to use Entra auth with a service principal, which is why I'm using the Microsoft ODBC driver. This works fine on my local machine, but I can't figure out how to get the ODBC drivers installed on the app cluster. I can achieve this through an init script on normal clusters, but I'm not sure how to accomplish it on app clusters.&lt;/P&gt;</description>
      <pubDate>Mon, 14 Jul 2025 19:54:19 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/databricks-apps-with-pyodbc-microsoft-sql-driver/m-p/125203#M47367</guid>
      <dc:creator>bcodernet</dc:creator>
      <dc:date>2025-07-14T19:54:19Z</dc:date>
    </item>
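The question above hinges on Entra service-principal auth through the Microsoft ODBC driver. As a minimal sketch of what that connection looks like once the driver is available, assuming ODBC Driver 18 and placeholder server, database, and credential values (none of these specifics come from the thread):

```python
# Hypothetical pyodbc connection string for Azure SQL with Entra
# service-principal auth via ODBC Driver 18. All values below are
# placeholders, not details from this thread.
conn_str = (
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=tcp:myserver.database.windows.net,1433;"
    "Database=mydb;"
    "Authentication=ActiveDirectoryServicePrincipal;"
    "UID=<application-client-id>;"   # service principal (client) ID
    "PWD=<client-secret>;"           # service principal secret
    "Encrypt=yes;TrustServerCertificate=no;"
)

# With the driver installed on the compute, you would connect like:
# import pyodbc
# conn = pyodbc.connect(conn_str)
```

The `Authentication=ActiveDirectoryServicePrincipal` keyword is what makes the driver exchange the client ID/secret for an Entra token; this is exactly the part that fails if only the `pyodbc` wheel is installed without the underlying Microsoft ODBC driver.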
    <item>
      <title>Re: Databricks Apps with Pyodbc Microsoft SQL Driver</title>
      <link>https://community.databricks.com/t5/data-engineering/databricks-apps-with-pyodbc-microsoft-sql-driver/m-p/125206#M47368</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/28848"&gt;@bcodernet&lt;/a&gt;&amp;nbsp;,&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;Each Databricks app can include dependencies for Python, Node.js, or both. You define these dependencies in language-specific files:&lt;/SPAN&gt;&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;Use a&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;requirements.txt&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;file to specify additional&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;STRONG&gt;Python&lt;/STRONG&gt;&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;packages.&lt;/LI&gt;&lt;LI&gt;Use a&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;package.json&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;file to specify&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;STRONG&gt;Node.js&lt;/STRONG&gt;&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;packages.&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;&lt;A href="https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/dependencies" target="_blank" rel="noopener"&gt;Manage dependencies for a Databricks app - Azure Databricks | Microsoft Learn&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Mon, 14 Jul 2025 20:21:51 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/databricks-apps-with-pyodbc-microsoft-sql-driver/m-p/125206#M47368</guid>
      <dc:creator>szymon_dybczak</dc:creator>
      <dc:date>2025-07-14T20:21:51Z</dc:date>
    </item>
    <item>
      <title>Re: Databricks Apps with Pyodbc Microsoft SQL Driver</title>
      <link>https://community.databricks.com/t5/data-engineering/databricks-apps-with-pyodbc-microsoft-sql-driver/m-p/125207#M47369</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/110502"&gt;@szymon_dybczak&lt;/a&gt;&amp;nbsp;,&lt;/P&gt;&lt;P&gt;I'm fully aware of using requirements.txt to install libraries; I have used it to install the pyodbc dependency. However, you still have to download and install the Microsoft ODBC driver, and that can't be done through the requirements.txt route. On normal clusters, you have to use an init script to run a few shell commands to get this to work.&lt;/P&gt;</description>
      <pubDate>Mon, 14 Jul 2025 20:27:03 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/databricks-apps-with-pyodbc-microsoft-sql-driver/m-p/125207#M47369</guid>
      <dc:creator>bcodernet</dc:creator>
      <dc:date>2025-07-14T20:27:03Z</dc:date>
    </item>
    <item>
      <title>Re: Databricks Apps with Pyodbc Microsoft SQL Driver</title>
      <link>https://community.databricks.com/t5/data-engineering/databricks-apps-with-pyodbc-microsoft-sql-driver/m-p/125210#M47371</link>
      <description>&lt;P&gt;Sorry, you're right. But the thing is that serverless compute does not support JAR file installation, so you cannot use JDBC or ODBC drivers.&lt;BR /&gt;With Databricks Apps, however, you have the ability to trigger&amp;nbsp;&lt;SPAN&gt;Lakeflow Jobs. You can create a job that uses classic compute and install the ODBC driver on that compute. The ingestion part of your app is then handled by the classic-compute job, and you can operate on the result within the app using serverless compute.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="szymon_dybczak_0-1752527231635.png" style="width: 400px;"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/18183i1D192E7E73D6FF14/image-size/medium?v=v2&amp;amp;px=400" role="button" title="szymon_dybczak_0-1752527231635.png" alt="szymon_dybczak_0-1752527231635.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;&lt;A href="https://learn.microsoft.com/en-us/azure/databricks/compute/serverless/best-practices#ingesting-data-from-external-systems" target="_blank" rel="noopener"&gt;Best practices for serverless compute - Azure Databricks | Microsoft Learn&lt;/A&gt;&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Mon, 14 Jul 2025 21:10:22 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/databricks-apps-with-pyodbc-microsoft-sql-driver/m-p/125210#M47371</guid>
      <dc:creator>szymon_dybczak</dc:creator>
      <dc:date>2025-07-14T21:10:22Z</dc:date>
    </item>
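The final reply suggests having the app trigger a Lakeflow job on classic compute (where the ODBC driver can be installed via init script) and then working with the result on serverless. A minimal sketch of the trigger side, assuming the `databricks-sdk` package and ambient workspace authentication, with `job_id` as a placeholder for a real job:

```python
def trigger_ingestion_job(job_id: int):
    """Kick off the classic-compute ingestion job and block until it
    finishes. Assumes the databricks-sdk package is installed and the
    app environment provides workspace credentials; job_id is a
    placeholder for the actual Lakeflow job."""
    from databricks.sdk import WorkspaceClient

    w = WorkspaceClient()
    run = w.jobs.run_now(job_id=job_id)  # returns a waiter for the run
    return run.result()                  # blocks until the run terminates
```

The split of responsibilities is the key design point: the job owns the driver-dependent ingestion, and the app only sees the landed data, so no ODBC installation is needed on the app's serverless compute.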
  </channel>
</rss>

