Data Engineering

Scheduling a job with multiple notebooks using a common parameter

Rk2
New Contributor II

I have a practical use case:

Three PySpark notebooks all have one common parameter.

I need to schedule all three notebooks to run in sequence.

Is there a way to run them by setting the parameter value once, since it is the same in all of them?

Please suggest the best way to do this.
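
For context, a minimal sketch of the setup being asked about, assuming three hypothetical notebook paths and that each notebook reads the shared parameter with dbutils.widgets.get("my_parameter") (dbutils is available inside Databricks notebooks):

# Hypothetical driver notebook: runs the three notebooks one after another,
# passing the same parameter value to each.
common_value = "2022-01-01"  # example value, set once here

notebook_paths = [
    "/Shared/notebook_1",  # hypothetical paths
    "/Shared/notebook_2",
    "/Shared/notebook_3",
]

for path in notebook_paths:
    # dbutils.notebook.run(path, timeout_seconds, arguments)
    dbutils.notebook.run(path, 3600, {"my_parameter": common_value})

Scheduling this single driver notebook as a job is one option; the accepted answer below covers the other common option, a multi-task job where the same parameter is set on each task.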

Accepted Solution

Hubert-Dudek
Esteemed Contributor III

@Ramesh Kotha, in the notebook, get the parameter like this:

my_parameter = dbutils.widgets.get("my_parameter")

and set it on the task like this:

[Screenshot from the original post: the parameter key and value entered in the task's Parameters field in the Jobs UI]
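
The original screenshot is not reproduced here. As a rough sketch of the same idea, assuming hypothetical notebook paths and an example value, the job can be defined as a multi-task job in which every notebook task passes the same key under base_parameters and the tasks are chained with depends_on so they run in sequence (cluster settings and the schedule are omitted):

# Hypothetical multi-task job definition (Jobs API 2.1-style payload, written as
# a Python dict). Each notebook task receives the same parameter value, and
# depends_on chains the tasks so they run one after another.
common_value = "2022-01-01"  # example value, set once

job_spec = {
    "name": "three_notebooks_shared_parameter",
    "tasks": [
        {
            "task_key": "notebook_1",
            "notebook_task": {
                "notebook_path": "/Shared/notebook_1",  # hypothetical path
                "base_parameters": {"my_parameter": common_value},
            },
        },
        {
            "task_key": "notebook_2",
            "depends_on": [{"task_key": "notebook_1"}],
            "notebook_task": {
                "notebook_path": "/Shared/notebook_2",
                "base_parameters": {"my_parameter": common_value},
            },
        },
        {
            "task_key": "notebook_3",
            "depends_on": [{"task_key": "notebook_2"}],
            "notebook_task": {
                "notebook_path": "/Shared/notebook_3",
                "base_parameters": {"my_parameter": common_value},
            },
        },
    ],
}

Each notebook then reads the shared value with dbutils.widgets.get("my_parameter"), exactly as in the line above.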


Replies

Hi @Ramesh Kotha,

Just a friendly follow-up. Did @Hubert Dudek's response help you resolve your question? Please let us know.

Kaniz
Community Manager

Hi @Ramesh Kotha, just a friendly follow-up. Do you still need help, or did @Hubert Dudek's response help you find the solution? Please let us know.
