Parameterizing DLT Pipelines

Mustafa_Kamal
New Contributor II

Hi Everyone,

I have a DLT pipeline that I need to execute for different source systems. I need advice on how to parameterize it.

I have gone through many articles on the web, but it seems there is no accurate information available.

Can anyone please help?

Thanks,

4 REPLIES

AmanSehgal
Honored Contributor III

You can provide parameters in the configuration section of the DLT pipeline and access them in your code using spark.conf.get(<parameter_name>).

Parameterize DLT pipelines 
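
For example, a minimal sketch of what that looks like in a DLT notebook (the key name mypipeline.source_system and the paths are placeholders, not something fixed by DLT):

```python
import dlt
from pyspark.sql import functions as F

# Read the value set under "Configuration" in the pipeline settings,
# e.g. mypipeline.source_system = sales_eu (key name is a placeholder).
source_system = spark.conf.get("mypipeline.source_system")

@dlt.table(name=f"{source_system}_orders")
def orders():
    # Placeholder landing-zone path; point it at your source's raw files.
    return (
        spark.read.format("json")
        .load(f"/mnt/raw/{source_system}/orders/")
        .withColumn("source_system", F.lit(source_system))
    )
```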

Mustafa_Kamal
New Contributor II

Thank you @AmanSehgal,

I have done that and was able to execute the pipeline successfully. But I need to change the parameter value at runtime, so that the same pipeline can be used for multiple sources.

Can we pass parameters from a job to a DLT pipeline?

 

AmanSehgal
Honored Contributor III

I tried, but it doesn't seem to work. I passed job parameters to the cluster configuration as {{job.parameters.name}}, but they weren't picked up.
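
For reference, this is roughly what the attempt looked like in the pipeline settings (the key and parameter names here are illustrative):

```json
{
  "configuration": {
    "mypipeline.source_system": "{{job.parameters.source_system}}"
  }
}
```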

You might have to put some wrapper logic around your pipeline for now to parameterize it. For instance, write some text into a file in a mounted location and then read it in the DLT pipeline for your conditional logic.
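
A rough sketch of that workaround (the paths, file name, and value are all placeholders):

```python
# In a notebook task that runs before the DLT pipeline:
# write the desired source system name to a file in a mounted location.
dbutils.fs.put("/mnt/config/dlt_source_system.txt", "sales_eu", True)  # overwrite=True
```

```python
# Inside the DLT pipeline: read the file back and use it in your logic.
import dlt

source_system = spark.read.text("/mnt/config/dlt_source_system.txt").first()[0].strip()

@dlt.table(name=f"{source_system}_raw")
def raw_data():
    # Placeholder path; point it at the landing zone for that source system.
    return spark.read.format("parquet").load(f"/mnt/raw/{source_system}/")
```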

Mustafa_Kamal
New Contributor II

Hi @AmanSehgal, yes, that's true. I had tried different options here. Thank you so much for taking the time to investigate this.