<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Delta Jira data import to Databricks in Administration &amp; Architecture</title>
    <link>https://community.databricks.com/t5/administration-architecture/delta-jira-data-import-to-databricks/m-p/154394#M5137</link>
    <description>&lt;P&gt;We need to import a large amount of Jira data into Databricks, and we should import only the delta changes. What's the best approach: using the Fivetran Jira connector, or developing our own Python scripts/pipeline code? Thanks.&lt;/P&gt;</description>
    <pubDate>Mon, 13 Apr 2026 22:00:23 GMT</pubDate>
    <dc:creator>greengil</dc:creator>
    <dc:date>2026-04-13T22:00:23Z</dc:date>
    <item>
      <title>Delta Jira data import to Databricks</title>
      <link>https://community.databricks.com/t5/administration-architecture/delta-jira-data-import-to-databricks/m-p/154394#M5137</link>
      <description>&lt;P&gt;We need to import a large amount of Jira data into Databricks, and we should import only the delta changes. What's the best approach: using the Fivetran Jira connector, or developing our own Python scripts/pipeline code? Thanks.&lt;/P&gt;</description>
      <pubDate>Mon, 13 Apr 2026 22:00:23 GMT</pubDate>
      <guid>https://community.databricks.com/t5/administration-architecture/delta-jira-data-import-to-databricks/m-p/154394#M5137</guid>
      <dc:creator>greengil</dc:creator>
      <dc:date>2026-04-13T22:00:23Z</dc:date>
    </item>
    <item>
      <title>Re: Delta Jira data import to Databricks</title>
      <link>https://community.databricks.com/t5/administration-architecture/delta-jira-data-import-to-databricks/m-p/154402#M5138</link>
      <description>&lt;P&gt;Hi &lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/200255"&gt;@greengil&lt;/a&gt;,&lt;/P&gt;
&lt;P&gt;Have you considered Lakeflow Connect? Databricks now has a native &lt;A href="https://docs.databricks.com/aws/en/ingestion/lakeflow-connect/jira" target="_blank"&gt;Jira connector&lt;/A&gt; in Lakeflow Connect that can do what you are looking for. It's in beta, but worth considering.&lt;/P&gt;
&lt;P&gt;It ingests Jira into Delta with incremental (delta) loads out of the box, supports SCD1/SCD2, handles deletes via audit logs, and runs fully managed on serverless with Unity Catalog governance. This is lower-effort and better integrated than either Fivetran or custom Python, and it directly targets your large-volume, changes-only requirement.&lt;/P&gt;
&lt;P&gt;If you can’t use the Databricks Jira connector, prefer the managed Fivetran Jira connector for Databricks over custom code for a low-maintenance ELT path. Only build custom Python pipelines if you have very specific requirements that neither managed option can meet.&lt;/P&gt;
&lt;P class="p1"&gt;&lt;FONT size="2" color="#FF6600"&gt;&lt;STRONG&gt;&lt;I&gt;If this answer resolves your question, could you mark it as “Accept as Solution”? That helps other users quickly find the correct fix.&lt;/I&gt;&lt;/STRONG&gt;&lt;/FONT&gt;&lt;/P&gt;</description>
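The custom-Python route mentioned in the reply can be sketched as an incremental pull keyed on Jira's `updated` field. This is a minimal, hypothetical sketch: the cursor-file name and page size are assumptions, and it targets the Jira REST search endpoint (`/rest/api/2/search`), which paginates with `startAt`/`maxResults`.

```python
# Hypothetical incremental Jira pull: fetch only issues updated since the
# last successful sync, tracked in a small local cursor file.
import json
import pathlib

import requests

STATE = pathlib.Path("jira_sync_state.json")  # assumed cursor location

def load_cursor():
    """Return the last-synced 'updated' timestamp in JQL datetime format."""
    if STATE.exists():
        return json.loads(STATE.read_text())["last_updated"]
    return "1970-01-01 00:00"  # first run: full backfill

def save_cursor(ts):
    """Persist the high-water mark after a successful batch."""
    STATE.write_text(json.dumps({"last_updated": ts}))

def fetch_changed_issues(base_url, auth, cursor, page_size=100):
    """Yield issues changed since `cursor`, paging with startAt/maxResults."""
    jql = f'updated >= "{cursor}" ORDER BY updated ASC'
    start = 0
    while True:
        resp = requests.get(
            f"{base_url}/rest/api/2/search",
            params={"jql": jql, "startAt": start, "maxResults": page_size},
            auth=auth,
            timeout=30,
        )
        resp.raise_for_status()
        page = resp.json()
        yield from page["issues"]
        start += len(page["issues"])
        if not page["issues"] or start >= page["total"]:
            break
```

On the Databricks side, each fetched batch would typically be upserted into a Delta table with MERGE on the issue key, and the cursor advanced to the latest `updated` value seen only after the write succeeds.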
      <pubDate>Tue, 14 Apr 2026 05:59:58 GMT</pubDate>
      <guid>https://community.databricks.com/t5/administration-architecture/delta-jira-data-import-to-databricks/m-p/154402#M5138</guid>
      <dc:creator>Ashwin_DSA</dc:creator>
      <dc:date>2026-04-14T05:59:58Z</dc:date>
    </item>
  </channel>
</rss>

