@Anish_2 - A Table Update Trigger (TUT) is the solution. With a TUT, instead of the parent pipeline "pushing" a notification, the child job is "pulled" into action by a change to the target table's metadata.
Set it up as follows.
1. Create a Databricks Job and add a Pipeline task pointing to your secondary DLT pipeline.
2. In the Job settings panel on the right, click Add trigger.
3. Choose the Table update trigger type.
4. Point it at the specific Unity Catalog table produced by your parent DLT pipeline.
5. Set the sensitivity: configure a "Minimum time between triggers" so that if the parent pipeline updates the table several times in a short window, the child pipeline doesn't restart unnecessarily.
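The same setup can also be expressed declaratively as Jobs API job settings. This is a sketch only: the pipeline ID, table name, and timing values below are placeholders you would replace with your own.

```json
{
  "name": "secondary-dlt-pipeline-job",
  "tasks": [
    {
      "task_key": "run_secondary_pipeline",
      "pipeline_task": {
        "pipeline_id": "<secondary-dlt-pipeline-id>"
      }
    }
  ],
  "trigger": {
    "pause_status": "UNPAUSED",
    "table_update": {
      "table_names": ["catalog.schema.parent_output_table"],
      "condition": "ANY_UPDATED",
      "min_time_between_triggers_seconds": 600,
      "wait_after_last_change_seconds": 120
    }
  }
}
```

Here `min_time_between_triggers_seconds` is the "Minimum time between triggers" debounce from the UI, and `wait_after_last_change_seconds` lets the table settle before the child run starts.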
Hope this gives you the desired behaviour.
RG #Driving Business Outcomes with Data Intelligence