MuthuLakshmi
Databricks Employee

@skumarrm 
Please try the following:

Set Up Task Parameters:

    • In the job configuration, you can set up task parameters to pass values from one task to another.
    • For TASK1 (DLT), ensure it outputs the PipelineID or PipelineName.

Use Task Parameters in TASK2:

    • In the notebook for TASK2, use the dbutils.widgets API to retrieve the parameters passed from TASK1.
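As an alternative to widgets: if the producing step can run notebook code, values can also be handed downstream with dbutils.jobs.taskValues (set in the upstream task, get in TASK2). The snippet below is a local sketch that mimics that set/get contract with an in-memory stand-in; the TaskValues class and the sample values are hypothetical, purely for illustration outside Databricks.

```python
# Local stand-in for dbutils.jobs.taskValues. On Databricks the real calls are
# dbutils.jobs.taskValues.set(key=..., value=...) in the upstream task and
# dbutils.jobs.taskValues.get(taskKey=..., key=..., default=...) in TASK2.
class TaskValues:
    def __init__(self):
        self._store = {}

    def set(self, key, value):
        # Called in the producing task (TASK1 side).
        self._store[key] = value

    def get(self, taskKey, key, default=None):
        # Called in the consuming task (TASK2 side).
        return self._store.get(key, default)

task_values = TaskValues()

# TASK1 side: publish the pipeline identifiers (sample values).
task_values.set("PipelineID", "1234-abcd")
task_values.set("PipelineName", "sales_dlt")

# TASK2 side: read them back, with a default if the key is missing.
pipeline_id = task_values.get(taskKey="TASK1", key="PipelineID", default="unknown")
print(f"PipelineID: {pipeline_id}")
```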

Here’s an example of how you can set this up:

TASK1 (DLT Pipeline)

  • Ensure your DLT pipeline is configured and running.

TASK2 (Non-DLT Notebook)

  • Create a notebook with the following code to retrieve the parameters:

# Retrieve the PipelineID or PipelineName passed from TASK1
pipeline_id = dbutils.widgets.get("PipelineID")
pipeline_name = dbutils.widgets.get("PipelineName")

# Use the parameters in your notebook logic
print(f"PipelineID: {pipeline_id}")
print(f"PipelineName: {pipeline_name}")

# Your notebook logic here
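Note that dbutils.widgets.get raises an error when the named parameter was not passed to the run, so a small defensive wrapper can fall back to a default instead. A minimal sketch; the Widgets class below is a stub standing in for dbutils.widgets so the example runs outside Databricks, and the parameter values are assumptions:

```python
def get_widget(widgets, name, default=""):
    """Return the widget value, or `default` when the widget is absent."""
    try:
        return widgets.get(name)
    except Exception:
        return default

# Stub standing in for dbutils.widgets, purely for local illustration.
class Widgets:
    def __init__(self, params):
        self._params = params

    def get(self, name):
        if name not in self._params:
            raise ValueError(f"No input widget named {name}")
        return self._params[name]

widgets = Widgets({"PipelineID": "1234-abcd"})
print(get_widget(widgets, "PipelineID"))           # value was passed by the job
print(get_widget(widgets, "PipelineName", "n/a"))  # missing -> falls back to default
```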

Job Configuration

  • In the job configuration, set up TASK1 to run the DLT pipeline.
  • Add TASK2 and configure it to run the notebook created above.
  • In the task settings for TASK2, add parameters to retrieve the PipelineID and PipelineName from TASK1.
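To wire those parameters up, the task settings for TASK2 can use dynamic value references of the form {{tasks.<task_name>.values.<key>}}. A sketch of what the notebook task's parameters block might look like in the job JSON; the task key, notebook path, and parameter names here are assumptions based on this thread, not a definitive configuration:

```json
{
  "task_key": "TASK2",
  "depends_on": [{ "task_key": "TASK1" }],
  "notebook_task": {
    "notebook_path": "/Workspace/path/to/your_notebook",
    "base_parameters": {
      "PipelineID": "{{tasks.TASK1.values.PipelineID}}",
      "PipelineName": "{{tasks.TASK1.values.PipelineName}}"
    }
  }
}
```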