Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Obtain refresh mode from within Delta Live Table pipeline run

NemesisMF
New Contributor II

Is it possible to find out, from within a notebook running in a DLT pipeline, whether the pipeline run is executing in Full Refresh or incremental mode? I looked into the pipeline configuration variables but could not find anything.

It would be beneficial to have this information available in the code so that we can do something different in the case of a full refresh.

 

My workaround so far is to have two pipeline jobs and to set a config variable indicating whether the run is a full refresh, but when executing the pipeline manually this gets dangerous, since I have to remember to set the value to the correct refresh type.
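For illustration, a minimal sketch of that workaround, assuming each of the two pipelines defines a configuration key in its pipeline settings (the key name refresh_mode below is just an example) and that DLT exposes it to the notebook through spark.conf:

# Each pipeline's settings set refresh_mode to "full" or "incremental"
# (hypothetical key name); the notebook only reads it back.
refresh_mode = spark.conf.get("refresh_mode", "incremental")

if refresh_mode == "full":
    print("Full refresh run: executing full-refresh-specific logic.")
else:
    print("Incremental run: executing incremental logic.")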

3 REPLIES

Walter_C
Databricks Employee

You can use the following code:

# Get the configuration of the current pipeline run
pipeline_run_config = get_current_pipeline_run_config()

# Create a StartUpdate object from the pipeline run configuration
start_update = StartUpdate.from_dict(pipeline_run_config)

# Check if the pipeline run is a full refresh
if start_update.full_refresh:
    print("The pipeline is running in Full Refresh mode.")
else:
    print("The pipeline is running in Incremental mode.")

NemesisMF
New Contributor II

Thanks for the quick answer. Where did you get get_current_pipeline_run_config() from? I used spark.conf.getAll, which apparently does not have the refresh mode info. And StartUpdate comes from databricks.sdk.service.pipelines?

Walter_C
Databricks Employee
Databricks Employee

I am looking into this further with our teams. Can you please provide us with more context on your use case for this information?
