Is there a way to let the DLT pipeline retry by itself?

guangyi
Contributor II

I know I can make a workflow job retry automatically by adding the following properties to the YAML file: max_retries or min_retry_interval_millis.
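For example, my bundle config looks roughly like this (the job, task, and notebook names here are just placeholders):

```yaml
# Sketch of task-level retry settings in a Databricks Asset Bundle.
# Job/task names and the notebook path are placeholders.
resources:
  jobs:
    my_job:
      tasks:
        - task_key: my_task
          max_retries: 3                    # retry the task up to 3 times
          min_retry_interval_millis: 60000  # wait at least 1 minute between retries
          notebook_task:
            notebook_path: ./my_notebook.py
```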

However, I cannot find similar attributes in any DLT pipeline documentation. When I ask Copilot, it gives this answer:

[Screenshot: Copilot's answer suggesting two retry attributes]

I tried adding these two attributes to the asset bundle YAML file. After deploying to prod, I found that the two properties were indeed added under the DLT pipeline settings > Advanced > Configuration section, but the pipeline still does not retry automatically.

Is there a way to let the DLT pipeline retry by itself?

Accepted Solution

szymon_dybczak
Contributor III

Hi @guangyi,

In DLT, you have the following two properties that you can set:

pipelines.maxFlowRetryAttempts

Type: int

The maximum number of attempts to retry a flow before failing a pipeline update when a retryable failure occurs.

The default value is two. By default, when a retryable failure occurs, the Delta Live Tables runtime attempts to run the flow three times including the original attempt.

pipelines.numUpdateRetryAttempts

Type: int

The maximum number of attempts to retry an update before failing the update when a retryable failure occurs. The retry is run as a full update.

The default is five. This parameter applies only to triggered updates run in production mode. There is no retry when your pipeline runs in development mode.
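If you're deploying with asset bundles, a minimal sketch of setting these would look like the following (the pipeline name and the values are just examples; the properties land in the pipeline's configuration map, which is what you see under Settings > Advanced > Configuration):

```yaml
# Sketch: DLT retry properties set via the pipeline's configuration map
# in a Databricks Asset Bundle. Pipeline name and values are placeholders;
# configuration values are strings.
resources:
  pipelines:
    my_pipeline:
      configuration:
        pipelines.maxFlowRetryAttempts: "3"     # retry a failed flow up to 3 times
        pipelines.numUpdateRetryAttempts: "10"  # retry a failed update up to 10 times
```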

You can also take a look at these related answers:

Delta Live Tables - RETRY_ON_FAILURE - Databricks Community - 5926

Solved: DLT Pipeline Retries - Databricks Community - 30388




