<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Lakeflow Connect for Jira - Questions in Warehousing &amp; Analytics</title>
    <link>https://community.databricks.com/t5/warehousing-analytics/lakeflow-connect-for-jira-questions/m-p/157038#M2578</link>
    <description>&lt;P&gt;We got Lakeflow Connect set up with Jira, and so far so good.&amp;nbsp; But when running the ETL pipeline to ingest Jira data into Databricks, one table (issue_with_deletes) fails with the following error (partially shown).&amp;nbsp; We followed the instructions on this page:&amp;nbsp;&lt;A href="https://docs.databricks.com/aws/en/ingestion/lakeflow-connect/jira-source-setup" rel="nofollow noopener noreferrer" target="_blank"&gt;https://docs.databricks.com/aws/en/ingestion/lakeflow-connect/jira-source-setup&lt;/A&gt;.&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;org.apache.spark.sql.streaming.StreamingQueryException: [STREAM_FAILED] Query [id = xxxxxxxxxxxxxxxx, runId = xxxxxxxxxxxx] terminated with exception: Job aborted due to stage failure: Task 0 in stage 146.0 failed 4 times, most recent failure: Lost task 0.3 in stage 146.0 (TID 149) (10.0.15.32 executor 0): com.databricks.pipelines.execution.conduit.common.DataConnectorException: [JIRA_ADMIN_PERMISSION_MISSING] Error encountered while calling Jira APIs. Source API type: sourceApi.jira.fetchAuditLogs. 
Ensure the connecting user has Jira admin permissions for your Jira instance.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;This page:&amp;nbsp;&lt;A href="https://docs.databricks.com/aws/en/ingestion/lakeflow-connect/jira-reference#supported-jira-source-tables" rel="nofollow noopener noreferrer" target="_blank"&gt;https://docs.databricks.com/aws/en/ingestion/lakeflow-connect/jira-reference#supported-jira-source-tables&lt;/A&gt; mentions that the user needs global admin permissions in Jira.&lt;/P&gt;&lt;P&gt;When setting up the connection in Catalog, our Databricks admin used his Databricks admin account, which has no admin privileges in Jira.&amp;nbsp; During setup, when clicking the "Sign into Jira" button, we entered the actual Jira admin credentials to log into Jira.&amp;nbsp; My understanding is that this is a one-time connection establishment between Jira and Databricks, and that the actual required permissions are defined by the scopes configured on the Jira OAuth app.&amp;nbsp; Reading the above page, it seems the Databricks admin account used to set up the connection also needs Jira admin permissions for the ingestion to work.&amp;nbsp; Is that the expectation?&amp;nbsp; If so, why do we need to set the OAuth app scopes at all?&amp;nbsp; Thanks.&lt;/P&gt;</description>
    <pubDate>Fri, 15 May 2026 20:12:43 GMT</pubDate>
    <dc:creator>greengil</dc:creator>
    <dc:date>2026-05-15T20:12:43Z</dc:date>
    <item>
      <title>Lakeflow Connect for Jira - Questions</title>
      <link>https://community.databricks.com/t5/warehousing-analytics/lakeflow-connect-for-jira-questions/m-p/157038#M2578</link>
      <description>&lt;P&gt;We got Lakeflow Connect set up with Jira, and so far so good.&amp;nbsp; But when running the ETL pipeline to ingest Jira data into Databricks, one table (issue_with_deletes) fails with the following error (partially shown).&amp;nbsp; We followed the instructions on this page:&amp;nbsp;&lt;A href="https://docs.databricks.com/aws/en/ingestion/lakeflow-connect/jira-source-setup" rel="nofollow noopener noreferrer" target="_blank"&gt;https://docs.databricks.com/aws/en/ingestion/lakeflow-connect/jira-source-setup&lt;/A&gt;.&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;org.apache.spark.sql.streaming.StreamingQueryException: [STREAM_FAILED] Query [id = xxxxxxxxxxxxxxxx, runId = xxxxxxxxxxxx] terminated with exception: Job aborted due to stage failure: Task 0 in stage 146.0 failed 4 times, most recent failure: Lost task 0.3 in stage 146.0 (TID 149) (10.0.15.32 executor 0): com.databricks.pipelines.execution.conduit.common.DataConnectorException: [JIRA_ADMIN_PERMISSION_MISSING] Error encountered while calling Jira APIs. Source API type: sourceApi.jira.fetchAuditLogs. 
Ensure the connecting user has Jira admin permissions for your Jira instance.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;This page:&amp;nbsp;&lt;A href="https://docs.databricks.com/aws/en/ingestion/lakeflow-connect/jira-reference#supported-jira-source-tables" rel="nofollow noopener noreferrer" target="_blank"&gt;https://docs.databricks.com/aws/en/ingestion/lakeflow-connect/jira-reference#supported-jira-source-tables&lt;/A&gt; mentions that the user needs global admin permissions in Jira.&lt;/P&gt;&lt;P&gt;When setting up the connection in Catalog, our Databricks admin used his Databricks admin account, which has no admin privileges in Jira.&amp;nbsp; During setup, when clicking the "Sign into Jira" button, we entered the actual Jira admin credentials to log into Jira.&amp;nbsp; My understanding is that this is a one-time connection establishment between Jira and Databricks, and that the actual required permissions are defined by the scopes configured on the Jira OAuth app.&amp;nbsp; Reading the above page, it seems the Databricks admin account used to set up the connection also needs Jira admin permissions for the ingestion to work.&amp;nbsp; Is that the expectation?&amp;nbsp; If so, why do we need to set the OAuth app scopes at all?&amp;nbsp; Thanks.&lt;/P&gt;</description>
      <pubDate>Fri, 15 May 2026 20:12:43 GMT</pubDate>
      <guid>https://community.databricks.com/t5/warehousing-analytics/lakeflow-connect-for-jira-questions/m-p/157038#M2578</guid>
      <dc:creator>greengil</dc:creator>
      <dc:date>2026-05-15T20:12:43Z</dc:date>
    </item>
  </channel>
</rss>

