Warehousing & Analytics
Engage in discussions on data warehousing, analytics, and BI solutions within the Databricks Community. Share insights, tips, and best practices for leveraging data for informed decision-making.

Lakeflow Connect for Jira - Questions

greengil
Contributor

So we got Lakeflow Connect set up with Jira, and so far so good. But when running the ETL pipeline to ingest Jira data into Databricks, one table (issue_with_deletes) fails with the following error (partially shown). We followed the instructions on this page: https://docs.databricks.com/aws/en/ingestion/lakeflow-connect/jira-source-setup.

org.apache.spark.sql.streaming.StreamingQueryException: [STREAM_FAILED] Query [id = xxxxxxxxxxxxxxxx, runId = xxxxxxxxxxxx] terminated with exception: Job aborted due to stage failure: Task 0 in stage 146.0 failed 4 times, most recent failure: Lost task 0.3 in stage 146.0 (TID 149) (10.0.15.32 executor 0): com.databricks.pipelines.execution.conduit.common.DataConnectorException: [JIRA_ADMIN_PERMISSION_MISSING] Error encountered while calling Jira APIs. Source API type: sourceApi.jira.fetchAuditLogs. Ensure the connecting user has Jira admin permissions for your Jira instance.
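For anyone hitting the same error: the failing source API (`sourceApi.jira.fetchAuditLogs`) appears to map to Jira Cloud's audit-records endpoint, which only users with the "Administer Jira" global permission can read. A quick way to check whether a given set of credentials would pass is to call that endpoint directly. This is a minimal sketch, assuming Jira Cloud with basic auth via an API token; the site URL, email, and token values are placeholders:

```python
import base64
import urllib.error
import urllib.request


def audit_records_url(site: str) -> str:
    """Build the URL for Jira Cloud's 'Get audit records' endpoint
    (admin-only: requires the 'Administer Jira' global permission)."""
    return f"{site.rstrip('/')}/rest/api/3/auditing/record"


def has_audit_access(site: str, email: str, api_token: str) -> bool:
    """Return True if the credentials can read the audit log.

    A 403 (or 401) response means the user lacks the global admin
    permission that the Lakeflow Jira connector's audit-log fetch needs.
    """
    cred = base64.b64encode(f"{email}:{api_token}".encode()).decode()
    req = urllib.request.Request(
        audit_records_url(site),
        headers={"Authorization": f"Basic {cred}"},
    )
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status == 200
    except urllib.error.HTTPError:
        return False


# Usage (fill in real values before running):
# has_audit_access("https://your-domain.atlassian.net", "you@example.com", "<api-token>")
```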

Looking at this page: https://docs.databricks.com/aws/en/ingestion/lakeflow-connect/jira-reference#supported-jira-source-t... it mentions that the user needs global admin permissions in Jira.

When setting up the connection in Catalog, our Databricks admin created it with his Databricks admin account. His Databricks account has no admin privileges in Jira. During setup, when clicking the "Sign in to Jira" button, we entered the actual Jira admin credentials to log into Jira. My understanding is that this is a one-time connection establishment between Jira and Databricks, and that the actual required permissions are set in the scope of the Jira OAuth app. Reading the page above, though, it sounds like the Databricks admin account used to set up the connection also needs Jira admin rights for the ingestion to work. Is that the expectation? If so, why do we then need to set the OAuth app scope? Thanks.
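One general OAuth point that may help frame the question: a token issued to the connector is limited by *both* the app's declared scopes *and* the Jira permissions of the user who signed in during the OAuth consent, so the token can never do more than that signing-in user can. The toy sketch below illustrates the scope side only; the scope names are assumptions (they resemble Atlassian's granular Jira scopes but should be confirmed against your OAuth app's configuration), and `missing_scopes` is a hypothetical helper, not a Databricks or Atlassian API:

```python
# Scopes the ingestion might need. These names are ASSUMPTIONS for
# illustration — verify against your Jira OAuth app's settings.
REQUIRED_SCOPES = {
    "read:jira-work",       # read issues, projects, worklogs, ...
    "read:audit-log:jira",  # assumed scope for the audit-log fetch
}


def missing_scopes(granted: set) -> set:
    """Return the required scopes that the OAuth consent did not grant.

    Even with an empty result, API calls can still fail if the user who
    authorized the app lacks the matching Jira permission (e.g. the
    audit log additionally requires 'Administer Jira').
    """
    return REQUIRED_SCOPES - granted


print(missing_scopes({"read:jira-work"}))
```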

0 REPLIES