Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Can I pass parameters to a Delta Live Tables pipeline at run time?

jfvizoso
New Contributor II

I need to execute a DLT pipeline from a Job, and I would like to know if there is any way of passing a parameter.

I know you can have settings in the pipeline that you use in the DLT notebook, but it seems you can only assign values to them when creating the pipeline. What I would like is to specify a value each time the pipeline is executed. Is that possible?

Thank you all.

4 REPLIES

kfoster
Contributor

You can call the API to update the pipeline and change the value, or, inside your notebook, you can query a Delta table, run SQL, etc., to return the value that needs to be used.
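
For the API route, here is a minimal sketch, assuming the Delta Live Tables Pipelines REST API and a personal access token; the host, token, pipeline ID, and the "my_param" key are placeholders, not values from this thread:

import requests

HOST = "https://<workspace>.cloud.databricks.com"  # placeholder workspace URL
TOKEN = "<personal-access-token>"                  # placeholder token
PIPELINE_ID = "<pipeline-id>"                      # placeholder pipeline ID
headers = {"Authorization": f"Bearer {TOKEN}"}

# Fetch the current pipeline spec.
resp = requests.get(f"{HOST}/api/2.0/pipelines/{PIPELINE_ID}", headers=headers)
resp.raise_for_status()
spec = resp.json()["spec"]

# Change the configuration value for this run ("my_param" is illustrative).
spec.setdefault("configuration", {})["my_param"] = "some value"

# Write the edited spec back; the edit endpoint expects the full spec.
requests.put(f"{HOST}/api/2.0/pipelines/{PIPELINE_ID}",
             headers=headers, json=spec).raise_for_status()

# Trigger a pipeline update that will pick up the new configuration.
requests.post(f"{HOST}/api/2.0/pipelines/{PIPELINE_ID}/updates",
              headers=headers).raise_for_status()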

Debayan
Databricks Employee

Hi, what kind of parameters are you trying to change at run time? Is it this one:

https://docs.databricks.com/workflows/delta-live-tables/delta-live-tables-configuration.html

(Delta Live Tables settings specify one or more notebooks that implement a pipeline and the parameters specifying how to run the pipeline in an environment, for example, development, staging, or production. Delta Live Tables settings are expressed as JSON and can be modified in the Delta Live Tables UI.)
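
One detail worth adding: values placed under the pipeline's configuration are exposed to the pipeline's notebooks as Spark conf entries, so a notebook can read them like this ("my_param" and the default are illustrative):

# Inside a DLT notebook: read a value set under "configuration"
# in the pipeline settings; the second argument is a fallback default.
my_param = spark.conf.get("my_param", "default_value")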

Mustafa_Kamal
New Contributor II

Hi @jfvizoso ,

I have the same scenario as well. Did you find any workaround?

Thanks in advance.

lprevost
Contributor

This seems to be the key to this question:

parameterize for dlt

My understanding is that you can add the parameter either in the DLT settings UI, via the Advanced Config / Add Configuration key-value dialog, or in the corresponding pipeline settings JSON, like this, using my custom parameter "crawls_param" = "last 4":

"configuration": {
        "crawls_param": "last 4"
    },

 

This would require changing those settings each time you run the job. But I'm not clear how you would pass, and subsequently read, a parameter when a pipeline is added as a task to a job. Jobs and tasks allow parameters to be passed; it's just not clear how to get those into the DLT pipeline.
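
On the reading side, here is a minimal sketch of how the crawls_param value from the JSON above could be consumed inside the pipeline notebook; the source table and column names are made up for illustration:

import dlt
from pyspark.sql import functions as F

# Pull the value set under "configuration" in the pipeline settings;
# the fallback default helps when developing outside the pipeline.
crawls_param = spark.conf.get("crawls_param", "last 4")

@dlt.table(comment="Crawls filtered by the crawls_param pipeline setting")
def filtered_crawls():
    # "raw_crawls" and "crawl_window" are hypothetical names
    return spark.read.table("raw_crawls").filter(
        F.col("crawl_window") == crawls_param
    )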
