Parameterizing DLT Jobs

leelee3000
New Contributor III

I have seen advanced configuration used to build a map of values as a way to parameterize notebooks, but those appear to be cluster-wide settings. Is there a recommended best practice for passing parameters directly to notebooks running on a DLT cluster? Specifically, if I have multiple notebooks running simultaneously and want to pass a unique parameter to each (e.g., {"name": "alpha"} for notebook1 and {"name": "beta"} for notebook2), how can I achieve this without impacting the other notebooks on the same DLT cluster? #deltalivetables
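
A minimal sketch of the configuration-map pattern the question refers to, assuming the pipeline's Configuration settings contain hypothetical per-notebook keys such as notebook1.name and notebook2.name; each notebook reads only its own namespaced key with spark.conf.get, so values do not collide even though the configuration map itself is pipeline-wide:

import dlt
from pyspark.sql.functions import lit

# Hypothetical pipeline configuration (set under the DLT pipeline's
# "Configuration" settings), namespaced per notebook:
#   notebook1.name = alpha
#   notebook2.name = beta

# In notebook1: read only the key namespaced for this notebook.
# ("spark" is provided automatically in Databricks notebooks.)
name = spark.conf.get("notebook1.name", "default")

@dlt.table(comment="Example table parameterized via pipeline configuration.")
def parameterized_table():
    # Tag each row with the parameter value passed to this notebook.
    return spark.range(10).withColumn("name", lit(name))

notebook2 would do the same with spark.conf.get("notebook2.name", ...), so each notebook picks up its own value from the shared map rather than receiving parameters individually.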

1 REPLY