<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Ingestion Framework in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/ingestion-framework/m-p/133419#M49839</link>
    <description>&lt;P&gt;Hi &lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/176268"&gt;@fjrodriguez&lt;/a&gt;, your assumptions are largely correct.&lt;/P&gt;&lt;P&gt;1) If you want to &lt;STRONG&gt;query or write to Azure SQL DB directly from Databricks SQL&lt;/STRONG&gt; (through Unity Catalog), you create an &lt;STRONG&gt;External Connection&lt;/STRONG&gt; in Unity Catalog and then define &lt;STRONG&gt;External Tables&lt;/STRONG&gt; that point to your Azure SQL DB tables. This is &lt;STRONG&gt;not strictly required&lt;/STRONG&gt;; do it if you want to manage these tables in UC for lineage and governance.&lt;/P&gt;&lt;P&gt;2) Unity Catalog supports stored procedures, but they are not the same as T-SQL procedures, and UC cannot call T-SQL procedures directly. Rewrite the logic as a UC stored procedure or as notebook/SQL tasks in your workflow.&lt;/P&gt;&lt;P&gt;3) Databricks Workflows are sufficient to take over ADF pipeline responsibilities such as orchestration and validation.&lt;/P&gt;</description>
    <pubDate>Wed, 01 Oct 2025 08:53:15 GMT</pubDate>
    <dc:creator>saurabh18cs</dc:creator>
    <dc:date>2025-10-01T08:53:15Z</dc:date>
    <item>
      <title>Ingestion Framework</title>
      <link>https://community.databricks.com/t5/data-engineering/ingestion-framework/m-p/133333#M49805</link>
      <description>&lt;P&gt;I would like to update my ingestion framework, which is currently orchestrated by ADF: it runs a couple of Databricks notebooks and then copies the data to the DB. I want to move everything onto Databricks, and I thought this could be the design:&lt;/P&gt;&lt;P&gt;Step 1. Expose the target tables in Unity Catalog: create a UC External Connection to Azure SQL DB and create the tables through this connection. Is this needed?&lt;/P&gt;&lt;P&gt;Step 2. Rewrite the stored procedure logic in UC, since stored procedures cannot be mounted the way tables or views can, so a rewrite seems necessary.&lt;/P&gt;&lt;P&gt;Step 3. Orchestrate in Databricks Workflows: create a workflow job with tasks: Notebook task → performs validation / preprocessing if needed; SQL task → executes CALL on a stored procedure managed by UC; Downstream tasks → e.g., write results to the DB.&lt;/P&gt;&lt;P&gt;The motivation is to leverage Unity Catalog and Workflows and deprecate ADF in the future; I started this PoC as a preliminary step.&lt;/P&gt;&lt;P&gt;Are my assumptions correct? Any recommendations?&lt;/P&gt;</description>
      <pubDate>Tue, 30 Sep 2025 09:02:34 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/ingestion-framework/m-p/133333#M49805</guid>
      <dc:creator>fjrodriguez</dc:creator>
      <dc:date>2025-09-30T09:02:34Z</dc:date>
    </item>
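    <!-- Step 1 above (exposing Azure SQL DB tables in Unity Catalog) can be sketched in Databricks SQL via Lakehouse Federation. A hedged sketch: the connection name, catalog name, host, secret scope, and table names below are placeholders, not values from this thread.

    ```sql
    -- Hedged sketch: expose Azure SQL DB tables in Unity Catalog
    -- via Lakehouse Federation. All names here are placeholders.
    CREATE CONNECTION azure_sql_conn TYPE sqlserver
    OPTIONS (
      host 'myserver.database.windows.net',
      port '1433',
      user secret('my_scope', 'sql_user'),
      password secret('my_scope', 'sql_password')
    );

    -- A foreign catalog mirrors the remote database; note it is read-only.
    CREATE FOREIGN CATALOG azure_sql_cat
    USING CONNECTION azure_sql_conn
    OPTIONS (database 'mydb');

    -- Remote tables then appear as azure_sql_cat.schema.table for querying.
    SELECT * FROM azure_sql_cat.dbo.customers LIMIT 10;
    ```

    The read-only nature of foreign catalogs is worth noting up front, since it affects whether Step 3's "write results to DB" can go through UC at all. -->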
    <item>
      <title>Re: Ingestion Framework</title>
      <link>https://community.databricks.com/t5/data-engineering/ingestion-framework/m-p/133419#M49839</link>
      <description>&lt;P&gt;Hi &lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/176268"&gt;@fjrodriguez&lt;/a&gt;, your assumptions are largely correct.&lt;/P&gt;&lt;P&gt;1) If you want to &lt;STRONG&gt;query or write to Azure SQL DB directly from Databricks SQL&lt;/STRONG&gt; (through Unity Catalog), you create an &lt;STRONG&gt;External Connection&lt;/STRONG&gt; in Unity Catalog and then define &lt;STRONG&gt;External Tables&lt;/STRONG&gt; that point to your Azure SQL DB tables. This is &lt;STRONG&gt;not strictly required&lt;/STRONG&gt;; do it if you want to manage these tables in UC for lineage and governance.&lt;/P&gt;&lt;P&gt;2) Unity Catalog supports stored procedures, but they are not the same as T-SQL procedures, and UC cannot call T-SQL procedures directly. Rewrite the logic as a UC stored procedure or as notebook/SQL tasks in your workflow.&lt;/P&gt;&lt;P&gt;3) Databricks Workflows are sufficient to take over ADF pipeline responsibilities such as orchestration and validation.&lt;/P&gt;</description>
      <pubDate>Wed, 01 Oct 2025 08:53:15 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/ingestion-framework/m-p/133419#M49839</guid>
      <dc:creator>saurabh18cs</dc:creator>
      <dc:date>2025-10-01T08:53:15Z</dc:date>
    </item>
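    <!-- Point 3 above (Workflows taking over ADF orchestration) can be sketched as a Databricks Jobs JSON payload with the three tasks the original question described: notebook validation, a SQL task, and a downstream write. The job name, task keys, notebook paths, warehouse ID, and query ID are all placeholder assumptions.

    ```json
    {
      "name": "ingestion_framework_poc",
      "tasks": [
        {
          "task_key": "validate",
          "notebook_task": { "notebook_path": "/Repos/ingest/validate" }
        },
        {
          "task_key": "run_sp",
          "depends_on": [ { "task_key": "validate" } ],
          "sql_task": {
            "warehouse_id": "WAREHOUSE_ID_PLACEHOLDER",
            "query": { "query_id": "QUERY_ID_PLACEHOLDER" }
          }
        },
        {
          "task_key": "write_to_db",
          "depends_on": [ { "task_key": "run_sp" } ],
          "notebook_task": { "notebook_path": "/Repos/ingest/write_to_sql" }
        }
      ]
    }
    ```

    The depends_on chain reproduces the linear ADF pipeline; fan-out is possible by giving several tasks the same dependency. -->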
    <item>
      <title>Re: Ingestion Framework</title>
      <link>https://community.databricks.com/t5/data-engineering/ingestion-framework/m-p/135798#M50433</link>
      <description>&lt;P&gt;Hey &lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/22314"&gt;@saurabh18cs&lt;/a&gt;,&lt;/P&gt;&lt;P&gt;It is taking longer than expected to expose the Azure SQL tables in UC. I can do it through a Foreign Catalog, but that is not what I want, because it is read-only. As far as I can see, an external connection is for cloud object storage paths (ADLS/S3/GCS), isn't it?&lt;/P&gt;</description>
      <pubDate>Thu, 23 Oct 2025 07:04:30 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/ingestion-framework/m-p/135798#M50433</guid>
      <dc:creator>fjrodriguez</dc:creator>
      <dc:date>2025-10-23T07:04:30Z</dc:date>
    </item>
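    <!-- Since a foreign catalog is read-only, the write path back to Azure SQL DB typically runs through Spark's JDBC writer in a notebook task. A hedged sketch: the server, database, table, and credential values are placeholders, and the actual write is shown only as a comment because it needs a Databricks cluster.

    ```python
    # Hedged sketch: the write path to Azure SQL DB from a Databricks
    # notebook goes through Spark's JDBC writer (foreign catalogs are
    # read-only). Server, database, and credentials are placeholders.

    def jdbc_options(server: str, database: str, table: str,
                     user: str, password: str) -> dict:
        """Assemble the option map for a Spark JDBC write to Azure SQL DB."""
        return {
            "url": (f"jdbc:sqlserver://{server}.database.windows.net:1433;"
                    f"database={database};encrypt=true"),
            "dbtable": table,
            "user": user,
            "password": password,
            "driver": "com.microsoft.sqlserver.jdbc.SQLServerDriver",
        }

    # In a notebook this would be used roughly as:
    # (df.write.format("jdbc")
    #    .options(**jdbc_options("myserver", "mydb", "dbo.results", u, p))
    #    .mode("append")
    #    .save())
    ```

    Keeping the option assembly in a plain function makes it reusable across notebook tasks and easy to point at a secret scope for credentials. -->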
  </channel>
</rss>

