Run one workflow dynamically with different parameters and schedule times.

yo1
New Contributor II

Can we run one workflow with different parameters and different schedule times, so that a single workflow can be executed for different parameters and we do not have to create the workflow again and again? In other words, is there any way to drive a workflow dynamically?

6 REPLIES

labromb
Contributor

Assuming that by workflow you mean jobs, then yes: parameters can be passed using the Jobs API.

https://learn.microsoft.com/en-us/azure/databricks/dev-tools/api/2.0/jobs

Scheduling is a different matter. It may be possible through the spark_submit_params that the API exposes, but that is not something I have tried. I ended up creating a scheduling engine outside of Databricks, called by Data Factory, which works out what to execute at a given invocation time.

The parameter values for the Jobs API are passed from the scheduling engine into Databricks by Data Factory, which then overrides the parameter values stored at the job level.
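As a rough sketch of that approach, the run-now call can also be issued directly against the Jobs REST API from any scheduler. This is a minimal illustration, not the poster's actual engine; the workspace URL, token, job ID, and parameter names are all assumptions, and the endpoint used is the documented POST /api/2.1/jobs/run-now.

```python
import json
import urllib.request

def build_run_now_payload(job_id: int, notebook_params: dict) -> dict:
    """Build the request body for POST /api/2.1/jobs/run-now.

    The notebook_params supplied here override the parameter values
    stored at the job level, as described above.
    """
    return {"job_id": job_id, "notebook_params": notebook_params}

def run_now(host: str, token: str, job_id: int, notebook_params: dict) -> dict:
    """Trigger a job run. host and token are placeholders
    (your workspace URL and a personal access token)."""
    req = urllib.request.Request(
        f"{host}/api/2.1/jobs/run-now",
        data=json.dumps(build_run_now_payload(job_id, notebook_params)).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)  # response includes a run_id on success

# The external scheduler supplies different values per invocation, e.g.:
payload = build_run_now_payload(123, {"run_date": "2024-01-01", "env": "prod"})
```

Each scheduled invocation only changes the dictionary passed in; the job definition itself is created once.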

I am also assuming the underlying code is generic and will run based on the parameter values provided...

fabio2352
Contributor

Hi, if you are using Databricks you can use Jobs:

https://docs.databricks.com/workflows/jobs/jobs.html

E.g., you can create a notebook with parameter widgets.

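A minimal sketch of such a parameterized notebook, assuming it runs on Databricks where `dbutils` is available; the widget names and defaults here are invented for illustration, and the same lookup-with-default pattern is shown standalone so it can run outside a notebook:

```python
# Inside a Databricks notebook, parameters arrive as widgets, and the
# notebook_params sent by run-now override the defaults declared here:
#
#   dbutils.widgets.text("table_name", "default_table")
#   dbutils.widgets.text("run_date", "2024-01-01")
#   table_name = dbutils.widgets.get("table_name")
#
# The equivalent lookup-with-default behavior as plain Python:
def get_param(params: dict, name: str, default: str) -> str:
    """Return the caller-supplied override if present, else the default."""
    return params.get(name, default)

# notebook_params passed by the job would land here (hypothetical values):
overrides = {"table_name": "sales_2024"}
table_name = get_param(overrides, "table_name", "default_table")
run_date = get_param(overrides, "run_date", "2024-01-01")
```

Because the notebook reads everything through widgets, the same notebook serves every scheduled run; only the parameter values differ.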

DBXC
Contributor

Could someone please provide a working example on the CLI side using the run-now command with JSON job parameters? Is this a bug within the CLI?

 

I am experiencing a similar problem to the post below, currently on CLI v0.2:

https://community.databricks.com/t5/data-engineering/in-azure-databricks-cli-how-to-pass-in-the-para...

Kaniz
Community Manager

Update / Solved: 

Using CLI on Linux/MacOS: 

Send in the sample JSON with job_id in it:
databricks jobs run-now --json '{
  "job_id": <job-ID>,
  "notebook_params": {
    "<key>": "<value>",
    "<key>": "<value>"
  }
}'

 

Using CLI on Windows: 

Send in the sample JSON with job_id in it and stringify the JSON as below:
databricks jobs run-now --json '{ \"job_id\": <job-ID>, \"notebook_params\": { \"<key>\": \"<value>\", \"<key>\": \"<value>\" } }'
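Shell quoting is the usual source of errors with inline `--json`. One way to sidestep it (an illustration, not part of the original answer) is to build the JSON programmatically and let `json.dumps` and `shlex.quote` handle the quoting; the job ID and parameter names below are hypothetical:

```python
import json
import shlex

# json.dumps guarantees a well-formed, correctly quoted JSON string.
body = json.dumps({
    "job_id": 123,                       # hypothetical job ID
    "notebook_params": {"env": "prod"},  # hypothetical parameters
})

# shlex.quote wraps the string safely for a POSIX shell, so no manual
# escaping of inner double quotes is needed.
command = f"databricks jobs run-now --json {shlex.quote(body)}"
```

Writing the payload to a file and passing it with `--json @file.json` avoids shell quoting entirely and works the same on Windows.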
