Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Databricks workflow design

Anish_2
New Contributor III

Hello Team,

I have a use case in which I want to trigger another DLT pipeline once one table succeeds in my parent DLT pipeline. I don't want to create a pipeline-to-pipeline dependency. Is there any way to create a table-to-pipeline dependency?

Thank you

Anish

2 ACCEPTED SOLUTIONS


MoJaMa
Databricks Employee

You can try Table Update Trigger (TUT).

So your pipeline 2 will be triggered when there is an "update" to one or more chosen tables produced by your pipeline 1.

https://docs.databricks.com/aws/en/jobs/trigger-table-update


Raman_Unifeye
Honored Contributor III

@Anish_2 - TUT is the solution. With TUT, instead of the parent pipeline "pushing" a notification, the child job is "pulled" into action by a metadata change on the table.

Set it up as below.

  1. Create a Databricks Job and add a Pipeline task pointing to your Secondary DLT pipeline.

  2. In the Job settings panel on the right, click Add trigger.

  3. Choose the Table update trigger type.

  4. Point it to the specific Unity Catalog table generated by your parent DLT pipeline.

  5. Set Sensitivity: Configure a "Minimum time between triggers" to ensure that if the parent pipeline updates the table multiple times in a short window, the child pipeline doesn't restart unnecessarily.
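As a sketch, the same setup can be expressed as a Jobs API-style job definition instead of the UI. The field names below (`table_update`, `min_time_between_triggers_seconds`) and the pipeline ID are assumptions for illustration; verify them against the docs linked above before use.

```python
# Sketch of a Databricks Jobs create-job payload with a table update trigger.
# Field names and values are assumptions; check the Jobs API reference.
import json

job_config = {
    "name": "child-dlt-job",
    "tasks": [
        {
            "task_key": "run_child_pipeline",
            # Hypothetical ID of the secondary DLT pipeline (step 1)
            "pipeline_task": {"pipeline_id": "<child-pipeline-id>"},
        }
    ],
    "trigger": {
        "pause_status": "UNPAUSED",
        "table_update": {
            # Unity Catalog table written by the parent DLT pipeline (step 4)
            "table_names": ["main.parent_schema.parent_table"],
            # Sensitivity (step 5): avoid restarting the child job when the
            # parent updates the table several times in a short window
            "min_time_between_triggers_seconds": 600,
        },
    },
}

print(json.dumps(job_config, indent=2))
```

The payload could then be sent to the jobs/create endpoint with any HTTP client or the Databricks CLI.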

Hope this gives you the desired behaviour.


RG #Driving Business Outcomes with Data Intelligence
