Machine Learning
Dive into the world of machine learning on the Databricks platform. Explore discussions on algorithms, model training, deployment, and more. Connect with ML enthusiasts and experts.

Run one workflow dynamically with different parameters and schedule times

yo1
New Contributor II

Can we run one workflow with different parameters and different schedule times, so that a single workflow can be executed for different parameters and we do not have to create the workflow again and again? In other words, is there any way to drive a workflow dynamically?

5 REPLIES

labromb
Contributor

Assuming by workflow you mean jobs, then yes for parameters, using the Jobs API:

https://learn.microsoft.com/en-us/azure/databricks/dev-tools/api/2.0/jobs

Scheduling is a different matter. It may be possible through the spark_submit_params that the API exposes, but that is not something I have tried. I ended up creating a scheduling engine outside of Databricks, called by Data Factory, which works out what to execute at a given invocation time.

The parameter values to be provided to the Jobs API are passed from the scheduling engine into Databricks by Data Factory, which then overrides the parameter values stored at the job level.

I am also assuming the underlying code is generic and will run based on the parameter values provided.
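A minimal sketch of triggering a job with parameters through the Jobs API (the `/api/2.1/jobs/run-now` endpoint; the job ID, parameter names, and the `DATABRICKS_HOST`/`DATABRICKS_TOKEN` environment variables here are illustrative assumptions, not values from this thread):

```python
import json

def build_run_now_payload(job_id, notebook_params):
    """Build the JSON body for POST /api/2.1/jobs/run-now."""
    return {"job_id": job_id, "notebook_params": notebook_params}

# Each invocation can pass different parameter values to the same job.
payload = build_run_now_payload(123, {"run_date": "2024-06-01"})
print(json.dumps(payload))

# The actual call (requires the `requests` package and valid credentials):
# import os, requests
# requests.post(
#     f"{os.environ['DATABRICKS_HOST']}/api/2.1/jobs/run-now",
#     headers={"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"},
#     json=payload,
# )
```

A caller such as Data Factory (or any scheduler) can invoke this at different times with different `notebook_params`, which is what makes the single job reusable.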

fabio2352
Contributor

Hi, if you are using Databricks, you can use jobs:

https://docs.databricks.com/workflows/jobs/jobs.html

For example, you can create a notebook with parameter variables.

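A sketch of that notebook-parameter pattern (`dbutils.widgets` exists only inside a Databricks notebook, so the dict-based helper below merely simulates its lookup-with-default behaviour; the `run_date` parameter name is an illustrative assumption):

```python
# Inside a Databricks notebook, parameters arrive as widgets:
# dbutils.widgets.text("run_date", "2024-01-01")   # declare with a default
# run_date = dbutils.widgets.get("run_date")       # value passed via notebook_params

# Outside a notebook, the same lookup-with-default can be simulated with a dict:
def get_param(params, name, default):
    """Return the supplied parameter value, or the default if absent."""
    return params.get(name, default)

print(get_param({"run_date": "2024-06-01"}, "run_date", "2024-01-01"))  # → 2024-06-01
print(get_param({}, "run_date", "2024-01-01"))                          # → 2024-01-01
```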

DBXC
Contributor

Could someone please provide a working example on the CLI side using the run-now command with JSON job parameters? Is this a bug within the CLI?

 

I am experiencing a similar problem to the post below, and am currently on CLI v0.2.

https://community.databricks.com/t5/data-engineering/in-azure-databricks-cli-how-to-pass-in-the-para...

Update / Solved: 

Using CLI on Linux/MacOS: 

Send in the JSON with the job_id in it:
databricks jobs run-now --json '{
  "job_id": <job-ID>,
  "notebook_params": {
    <key>: <value>,
    <key>: <value>
  }
}'

 

Using CLI on Windows:

Send in the same JSON with the job_id in it, stringifying it (escaping the inner double quotes) as below:
databricks jobs run-now --json '{ \"job_id\": <job-ID>, \"notebook_params\": { <key>: <value>, <key>: <value> } }'

