Can't view DAB-deployed pipelines in Databricks UI

lezwon
Contributor
I am using Databricks Asset Bundles to version-control the jobs and pipelines in my workspace. I recently pulled these pipelines from the workspace with the `databricks bundle generate pipeline` command and deployed them back with `databricks bundle deploy`, authenticating as a service principal I created for this purpose. The deployment succeeded, but in the UI I only see the created jobs, named like [dev service-principal-id] Job 1/2/3; the pipelines are missing. However, when I run `databricks pipelines list-pipelines`, I can see these pipelines. I'm not sure if I'm missing some permission. The workflows that use these pipelines show an error saying "Pipeline ID not found", the run fails at the pipeline task, and the run page does not load, so I cannot debug. Can anyone help me get access to the newly created pipelines?
1 ACCEPTED SOLUTION

Louis_Frolio
Databricks Employee

Hey @lezwon,

Thanks for the details and screenshots. This looks like a permissions/ownership issue with your newly deployed Delta Live Tables (DLT) pipelines.

### What’s going on
Pipelines run under the pipeline owner’s identity (Databricks recommends a service principal). If your service principal created the pipelines, they’re owned by that principal and are not visible to you unless you’re granted access.
* Lakeflow Declarative Pipelines (DLT) are ACL-protected securables with permission levels such as CAN_VIEW, CAN_RUN, CAN_MANAGE, and IS_OWNER. Without at least CAN_VIEW on a pipeline, it won’t appear in the UI for you, and job configuration UIs can show “Pipeline ID not found”.
* Your screenshots match this: the job’s pipeline task shows “Pipeline ID not found” and the run page won’t render pipeline output, while the CLI (authenticated as the service principal) can list the pipelines. This is consistent with missing ACLs for your user on the pipelines you deployed.
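
If you want to confirm this before changing anything, you can inspect the pipeline's ACLs from the CLI while authenticated as the service principal or an admin. A minimal sketch, assuming a recent Databricks CLI that exposes the `permissions` command group and a placeholder `<pipeline-id>` taken from `list-pipelines`:

```bash
# Grab the pipeline ID as the service principal (it can see the pipelines)
databricks pipelines list-pipelines

# Inspect the current ACLs on one pipeline; if your user is missing from
# access_control_list, the UI cannot resolve or display that pipeline for you
databricks permissions get pipelines <pipeline-id>
```

If the output lists only the service principal (as IS_OWNER), that confirms the diagnosis.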
### The fix: grant yourself (or your group) permissions on the pipelines
You can do this in two ways; the most repeatable option is to manage permissions in the bundle.
 
#### Option A: set pipeline permissions in your bundle and redeploy
Add a `permissions` block for each pipeline (or at the target level) so your user or group gets access on deploy:

```yaml
# databricks.yml (or an included resource file)
resources:
  pipelines:
    dlt_pipeline:
      name: dlt_pipeline
      # … your existing pipeline settings …
      permissions:
        - user_name: louis.frolio@databricks.com
          level: CAN_MANAGE  # or CAN_RUN / CAN_VIEW

# Alternatively, at target level to apply across resources:
targets:
  dev:
    resources:
      pipelines:
        dlt_pipeline:
          permissions:
            - user_name: louis.frolio@databricks.com
              level: CAN_MANAGE
```
Redeploy with `databricks bundle deploy`. This approach is documented and supported for pipelines in Asset Bundles.
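
For reference, the redeploy loop after adding the block might look like this; a sketch, assuming your target is named `dev` (inferred from the `[dev ...]` prefixes on your jobs):

```bash
# Catch YAML/schema mistakes in the new permissions block before deploying
databricks bundle validate -t dev

# Redeploy; existing resources are updated in place, now carrying your ACLs
databricks bundle deploy -t dev
```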

Notes:
* If you prefer to keep edit rights limited, use **CAN_VIEW** to see the pipeline and **CAN_RUN** to trigger updates; use **CAN_MANAGE** to also edit settings.

#### Option B: share from the UI
If you can log in as a workspace admin or the service principal owner, open Jobs & Pipelines → pipeline → Share, then add your user or group with the appropriate permission level and Save.
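
If you would rather script this than click through the UI, the same grant can be made through the Permissions API; a sketch, assuming the CLI's `permissions update` command and a hypothetical user email:

```bash
# Grant CAN_MANAGE on one pipeline to a named user (placeholder ID and
# email); the update patches the ACL, leaving existing entries intact
databricks permissions update pipelines <pipeline-id> --json '{
  "access_control_list": [
    {"user_name": "you@example.com", "permission_level": "CAN_MANAGE"}
  ]
}'
```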

### Why the job task says “Pipeline ID not found”
* The job edit/run page uses your identity in the UI. If you don’t have CAN_VIEW on the pipeline, the UI can’t resolve the referenced pipeline ID and shows the “Pipeline ID not found” message even though the pipeline exists and the service principal can see it.
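
One way to see the identity mismatch directly is to fetch the same pipeline under both identities; a sketch, assuming you have two CLI auth profiles configured (the profile names here are made up):

```bash
# As the service principal: returns the pipeline spec
databricks pipelines get <pipeline-id> --profile sp-deployer

# As your own user: expect a permission/not-found style error until you
# hold at least CAN_VIEW on the pipeline
databricks pipelines get <pipeline-id> --profile my-user
```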

### Additional checks and tips
* Confirm the pipeline ownership and permissions (as the service principal or an admin), then grant your user or a group the needed level. Pipeline updates always run as the owner’s identity; keeping the service principal as the owner is the recommended pattern.

* Ensure the job’s pipeline task targets a **triggered** pipeline (not continuous). Pipeline tasks in Jobs only support triggered pipelines selected from the “Pipeline” dropdown (a quick CLI check is sketched after this list).

* If non‑admins need to view driver logs for Unity Catalog‑enabled pipelines, add this to the pipeline config to allow logs for CAN_VIEW/CAN_RUN/CAN_MANAGE users:
```json
{
  "configuration": {
    "spark.databricks.acl.needAdminPermissionToViewLogs": "false"
  }
}
```

* If you can’t find the “Share” option in the pipeline UI at all, your workspace may have legacy access control toggles disabled; an admin might need to enable the “Clusters, Pools, Jobs Access Control” setting to manage ACLs on pipelines.
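
To check the triggered-vs-continuous point from the CLI, you can read the pipeline spec; a sketch, assuming the response exposes a `spec.continuous` flag (and that `jq` is installed):

```bash
# "true" means a continuous pipeline, which a job pipeline task cannot use;
# for a triggered pipeline the flag is false or absent
databricks pipelines get <pipeline-id> | jq '.spec.continuous'
```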
### Summary of next steps
* Add a `permissions` block for the pipelines in your bundle to grant your user the right level (CAN_VIEW or CAN_RUN, optionally CAN_MANAGE), then redeploy.
* Alternatively, have an admin or the service principal owner share the pipelines with you via Jobs & Pipelines → pipeline → Share.
* Re-open the job page; the pipeline reference should resolve, and runs should proceed unless there’s a separate configuration issue (compute, mode, permissions to data, etc.). If driver logs are still restricted, apply the log ACL config noted above.
 
Cheers, Louis.
