09-28-2022 03:20 AM
I need to execute a DLT pipeline from a Job, and I would like to know if there is any way of passing a parameter.
I know you can have settings in the pipeline that you use in the DLT notebook, but it seems you can only assign values to them when creating the pipeline. What I would like is to specify a value each time the pipeline is executed. Is that possible?
Thank you all.
09-28-2022 06:34 AM
You can call the API to update the pipeline and change the value, or, inside your notebook, you can query a Delta table, run SQL, etc., to return the value that needs to be used.
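For the second suggestion, here is a minimal sketch of looking up a runtime value from a Delta table inside the pipeline notebook. The control table "config.pipeline_params", the source table "raw.crawls", and their columns are assumptions for illustration only:

import dlt
from pyspark.sql import functions as F

def get_param(key: str, default: str = None) -> str:
    # Look up a runtime parameter from a hypothetical key/value control table.
    row = (
        spark.table("config.pipeline_params")
        .filter(F.col("key") == key)
        .select("value")
        .first()
    )
    return row["value"] if row else default

crawls_param = get_param("crawls_param", "last 4")

@dlt.table
def crawls_filtered():
    # "raw.crawls" and its "window" column are hypothetical source details.
    return spark.table("raw.crawls").filter(F.col("window") == crawls_param)

With this pattern, a job task that runs before the pipeline can write the desired value into the control table, so each run picks up a fresh value without editing the pipeline settings.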
09-28-2022 06:41 AM
Hi, what kind of parameters are you trying to change at run time? Is it this one:
https://docs.databricks.com/workflows/delta-live-tables/delta-live-tables-configuration.html
(Delta Live Tables settings specify one or more notebooks that implement a pipeline and the parameters specifying how to run the pipeline in an environment, for example, development, staging, or production. Delta Live Tables settings are expressed as JSON and can be modified in the Delta Live Tables UI.)
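For reference, a value defined under "configuration" in the pipeline settings is exposed to the pipeline's notebooks as a Spark configuration and can be read with spark.conf.get. A minimal sketch (the key name and fallback value are just examples):

# Reads a key defined under "configuration" in the pipeline settings JSON;
# the key name and the default used when it is absent are illustrative.
crawls_param = spark.conf.get("crawls_param", "last 4")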
09-29-2022 03:30 AM
Hi @José Fernández Vizoso, we haven't heard from you since the last responses from @kfoster and @Debayan Mukherjee, and I was checking back to see whether their suggestions helped you.
If you have found a solution, please share it with the community so it can help others.
Also, please don't forget to click the "Select As Best" button whenever a reply resolves your question.
08-07-2024 10:10 AM - edited 08-07-2024 10:12 AM
This seems to be the key to this question. My understanding is that you can add the parameter either in the DLT settings UI, via the Advanced Config > Add Configuration key/value dialog, or in the corresponding pipeline settings JSON, as shown here with my custom parameter "crawls_param" set to "last 4":
"configuration": {
"crawls_param": "last 4"
},
This would require changing those settings each time you run the job. But I'm not clear on how you would pass, and subsequently read, a parameter when a pipeline is added as a task in a job. Jobs and tasks allow parameters to be passed; it's just not clear how to get those into the DLT pipeline (one possible API-based workaround is sketched below).
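One workaround for driving this from a job is to have an earlier task call the Pipelines REST API to rewrite the configuration and then start an update. A sketch under those assumptions; the host, token, and pipeline ID are placeholders, and note that the edit endpoint replaces the full pipeline spec, so the existing spec is fetched and modified rather than sent partially:

import requests

host = "https://<workspace-host>"  # placeholder workspace URL
token = "<personal-access-token>"  # placeholder credential
pipeline_id = "<pipeline-id>"      # placeholder pipeline ID
headers = {"Authorization": f"Bearer {token}"}

# 1. Fetch the current pipeline spec; PUT replaces the whole spec,
#    so we modify the existing one instead of sending a partial body.
spec = requests.get(
    f"{host}/api/2.0/pipelines/{pipeline_id}", headers=headers
).json()["spec"]

# 2. Set the per-run parameter in the configuration map.
spec.setdefault("configuration", {})["crawls_param"] = "last 4"

# 3. Write the updated spec back.
requests.put(
    f"{host}/api/2.0/pipelines/{pipeline_id}", headers=headers, json=spec
).raise_for_status()

# 4. Trigger a pipeline update, which will see the new configuration value.
requests.post(
    f"{host}/api/2.0/pipelines/{pipeline_id}/updates", headers=headers
).raise_for_status()

This keeps the parameter flow inside the job itself: the update task runs first, then the pipeline task reads the value with spark.conf.get as shown earlier.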