Data Engineering

Schedule job runs with different parameters?

129876
New Contributor III

Is it possible to schedule different runs for a job with parameters? I have a notebook that generates data based on a supplied parameter, but I would like to schedule runs instead of starting them manually. I assume this would be possible using the REST API, but I would like to avoid having my token in plaintext. Would Git integration have something for this?


Debayan
Esteemed Contributor III

You can pass parameters for your task. Each task type has different requirements for formatting and passing the parameters.

https://docs.databricks.com/workflows/jobs/jobs.html#create-a-job

The REST API can also pass parameters for jobs. Tokens replace passwords in an authentication flow and should be protected like passwords, so Databricks recommends storing tokens securely rather than in plaintext.

Please refer to: https://docs.databricks.com/dev-tools/api/latest/authentication.html

Also, you can store tokens in a .netrc file and use them with curl: https://docs.databricks.com/dev-tools/api/latest/authentication.html#store-tokens-in-a-netrc-file-an...
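As a rough sketch of the REST approach (the workspace URL and job ID below are placeholders, and the token is read from an environment variable so it never appears in the script), triggering a run with notebook parameters could look like this:

```python
import json
import os
import urllib.request


def build_run_now_payload(job_id, notebook_params):
    """Build the JSON body for the Jobs API run-now endpoint."""
    return {"job_id": job_id, "notebook_params": notebook_params}


def trigger_run(host, token, job_id, notebook_params):
    """POST to /api/2.1/jobs/run-now; the token comes from the caller."""
    payload = build_run_now_payload(job_id, notebook_params)
    req = urllib.request.Request(
        f"{host}/api/2.1/jobs/run-now",
        data=json.dumps(payload).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


if os.environ.get("DATABRICKS_TOKEN"):
    # Placeholder workspace URL and job ID; substitute your own values.
    print(trigger_run("https://<workspace-url>", os.environ["DATABRICKS_TOKEN"],
                      123, {"date": "2023-01-01"}))
```

Keeping the token in an environment variable (or a secrets manager) is what avoids the plaintext concern from the original question.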

For Git integration, please refer to: https://docs.databricks.com/repos/index.html.

Please let us know if this helps.

129876
New Contributor III

Yes, I am aware of these. I wanted to know whether I can schedule tasks with different parameters. Is there any functionality in Databricks that would allow this?

Kaniz
Community Manager

Hi @k.b.,

To run a job with different parameters, use Run Now with Different Parameters to re-run the job with new values for its existing parameters.

  • Click the down arrow next to Run Now and select Run Now with Different Parameters or, in the Active Runs table, click Run Now with Different Parameters. Enter the new parameters depending on the type of task.

  • Notebook: You can enter parameters as key-value pairs or a JSON object. You can use this dialog to set the values of widgets.

  • JAR and spark-submit: You can enter a list of parameters or a JSON document. The provided parameters are merged with the default parameters for the triggered run. If you delete keys, the default parameters are used. You can also add task parameter variables for the run.

  • Click Run.

Source
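The merge behavior described above for JAR and spark-submit parameters can be sketched like this (a simplified illustration of the documented semantics, not Databricks' actual implementation):

```python
def merge_params(defaults, provided):
    """Provided parameters override the job's defaults; any key absent
    from 'provided' falls back to its default value."""
    return {**defaults, **provided}


# Defaults configured on the job, override supplied via Run Now.
defaults = {"env": "prod", "date": "2023-01-01"}
provided = {"date": "2023-06-15"}
print(merge_params(defaults, provided))  # {'env': 'prod', 'date': '2023-06-15'}
```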

129876
New Contributor III

I want to schedule this process to run automatically every day.

Kaniz
Community Manager

Hi @k.b., You can create and manage notebook jobs directly in the notebook UI, as described in this article.

You can create and manage schedules if a notebook is already assigned to one or more jobs.

If a notebook is not assigned to a job, you can create a job and a schedule to run the notebook.
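Creating a job with a schedule can also be done through the Jobs API. As a sketch (the job name, notebook path, and cron expression below are hypothetical examples, not values from this thread), the create payload combines a notebook task with a Quartz cron schedule:

```python
import json


def build_job_spec(notebook_path, cron, timezone, base_parameters):
    """Assemble a Jobs API 2.1 create payload: one notebook task plus
    a Quartz cron schedule so the job runs automatically."""
    return {
        "name": "daily-data-generation",
        "schedule": {
            "quartz_cron_expression": cron,
            "timezone_id": timezone,
            "pause_status": "UNPAUSED",
        },
        "tasks": [
            {
                "task_key": "generate",
                "notebook_task": {
                    "notebook_path": notebook_path,
                    "base_parameters": base_parameters,
                },
            }
        ],
    }


# "0 0 6 * * ?" fires every day at 06:00 in the given timezone.
spec = build_job_spec("/Repos/demo/generate", "0 0 6 * * ?", "UTC", {"region": "eu"})
print(json.dumps(spec, indent=2))
```

POSTing this body to /api/2.1/jobs/create (authenticated as in the earlier replies) creates the job with its daily schedule in one call.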

Please let us know if this helps.

129876
New Contributor III

Not really... I have different values for a single parameter within a notebook. Can I schedule, under a single job, runs for each different value?
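One pattern that fits this (sketched here as an assumption, since the thread does not settle on it) is a single job containing one task per parameter value: each task runs the same notebook with a different base parameter, all on the job's one schedule. The path, parameter name, and values below are hypothetical:

```python
def build_multi_value_job(notebook_path, param_name, values):
    """One job, one task per parameter value: each task runs the same
    notebook with a different base parameter on one shared schedule."""
    return {
        "name": "per-value-runs",
        "schedule": {
            "quartz_cron_expression": "0 0 6 * * ?",
            "timezone_id": "UTC",
        },
        "tasks": [
            {
                "task_key": f"run_{v}",
                "notebook_task": {
                    "notebook_path": notebook_path,
                    "base_parameters": {param_name: v},
                },
            }
            for v in values
        ],
    }


job = build_multi_value_job("/Repos/demo/generate", "region", ["eu", "us", "apac"])
print([t["task_key"] for t in job["tasks"]])  # ['run_eu', 'run_us', 'run_apac']
```

The alternative is a small scheduled driver that calls run-now once per value, as in the REST example earlier in the thread.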
