04-19-2023 02:59 AM
Can we run one workflow with different parameters and different schedule times, so that a single workflow can be executed for different parameters and we do not have to recreate it again and again? In other words, is there any way to drive a workflow dynamically?
04-19-2023 08:56 AM
Assuming by workflow you mean jobs, then yes, you can pass parameters using the Jobs API:
https://learn.microsoft.com/en-us/azure/databricks/dev-tools/api/2.0/jobs
Scheduling is a different matter. It may be possible through the spark_submit_params that the API exposes, but that is not something I have tried. I ended up creating a scheduling engine outside of Databricks, called by Data Factory, which works out what to execute at a given invocation time.
The parameter values to be provided to the Jobs API are passed from the scheduling engine into Databricks by Data Factory, which then overrides the parameter values stored at the job level.
I am also assuming the underlying code is generic and will run based on the parameter values provided...
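To make the override idea concrete, here is a minimal sketch of triggering a job with per-invocation parameters via the Jobs API run-now endpoint using only the Python standard library. The HOST, TOKEN, and job ID values are placeholders, not real credentials, and the helper names are my own:

```python
import json
import urllib.request

# Placeholders (assumptions) -- substitute your own workspace values:
HOST = "https://<your-workspace>.azuredatabricks.net"
TOKEN = "<personal-access-token>"

def build_run_now_payload(job_id, notebook_params):
    """Build the JSON body for POST /api/2.1/jobs/run-now.

    notebook_params passed here override the values stored at the job
    level, which is what lets one generic job serve many invocations.
    """
    return {"job_id": job_id, "notebook_params": notebook_params}

def run_job(job_id, notebook_params):
    """Fire the run-now request and return the HTTP response."""
    payload = json.dumps(build_run_now_payload(job_id, notebook_params)).encode()
    req = urllib.request.Request(
        f"{HOST}/api/2.1/jobs/run-now",
        data=payload,
        headers={
            "Authorization": f"Bearer {TOKEN}",
            "Content-Type": "application/json",
        },
    )
    return urllib.request.urlopen(req)

# Example payload for a nightly load with a date parameter:
# build_run_now_payload(123, {"run_date": "2023-04-19"})
```

An external scheduler (or Data Factory, as above) would call something like `run_job` with a different `notebook_params` dictionary for each scheduled invocation.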
04-28-2023 01:57 PM
Hi, if you are using Databricks, you can use jobs:
https://docs.databricks.com/workflows/jobs/jobs.html
For example, you can create a notebook that reads its parameters from variables (widgets).
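A hypothetical sketch of such a parameterised notebook is below. On Databricks, `dbutils.widgets` is provided by the runtime; the stub classes here only stand in for it (with sample values) so the sketch is self-contained outside a notebook:

```python
# Stub standing in for the Databricks-provided dbutils.widgets object.
# On a real cluster, delete this stub -- dbutils already exists.
class _WidgetsStub:
    _values = {"run_date": "2023-04-19", "environment": "dev"}  # sample params

    def text(self, name, default):
        # Declare a text widget with a default, keeping any value
        # already supplied (e.g. via notebook_params on run-now).
        self._values.setdefault(name, default)

    def get(self, name):
        # Read the widget's current value.
        return self._values[name]

class _DbutilsStub:
    widgets = _WidgetsStub()

dbutils = _DbutilsStub()

# Declare the parameters the job can override via notebook_params:
dbutils.widgets.text("run_date", "1900-01-01")
dbutils.widgets.text("environment", "dev")

run_date = dbutils.widgets.get("run_date")
environment = dbutils.widgets.get("environment")
print(f"Loading data for {run_date} in {environment}")
```

Because the notebook only ever reads `run_date` and `environment` through widgets, the same notebook serves every schedule and parameter combination.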
08-28-2023 09:53 AM
Could someone please provide a working example of the CLI run-now command with JSON job parameters? Or is this a bug in the CLI?
I am experiencing a problem similar to the post below and am currently on CLI v0.2.
08-29-2023 02:10 PM
Update / Solved:
Using the CLI on Linux/macOS, send the JSON with the job_id in it:
databricks jobs run-now --json '{
  "job_id": <job-ID>,
  "notebook_params": {
    "<key>": "<value>",
    "<key>": "<value>"
  }
}'
Using the CLI on Windows, send the same JSON with the job_id in it, but stringify it by escaping the inner quotes as below:
databricks jobs run-now --json '{ \"job_id\": <job-ID>, \"notebook_params\": { \"<key>\": \"<value>\", \"<key>\": \"<value>\" } }'
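For anyone who prefers to bypass the CLI, here is a sketch of the same call made against the REST endpoint directly with curl. HOST, TOKEN, and the job ID and parameter values are placeholders, so the curl line itself is left commented out; validating the body with python3 -m json.tool first is an easy way to catch the quoting mistakes discussed above:

```shell
# Placeholders (assumptions) -- substitute your own workspace values:
HOST="https://<your-workspace>.azuredatabricks.net"
TOKEN="<personal-access-token>"

# Build the request body once so it can be inspected and reused:
BODY='{ "job_id": 123, "notebook_params": { "run_date": "2023-04-19" } }'

# Check the body is valid JSON before sending it:
echo "$BODY" | python3 -m json.tool > /dev/null && echo "body ok"

# The actual call (commented out since HOST/TOKEN are placeholders):
# curl -s -X POST "$HOST/api/2.1/jobs/run-now" \
#   -H "Authorization: Bearer $TOKEN" \
#   -H "Content-Type: application/json" \
#   -d "$BODY"
```

Keeping the body in a single-quoted shell variable sidesteps the Windows-style escaping entirely on Linux/macOS.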