02-06-2023 02:08 AM
Suppose multiple jobs have been created using Databricks Workflows. The requirement is to build one master workflow that triggers all the other workflows according to different conditions: some should run daily, some weekly, and some monthly. Another condition to check is each job's dependency on other jobs: before executing a job, we need to check whether it depends on another job, and if so, run that workflow first and then trigger the next one.
02-06-2023 03:12 AM
Hi @Riya Vadhwani,
As of now, you cannot call a Databricks Workflows job from another job directly; you can use the Jobs API to run your other job.
Please refer to this link for more information regarding the Jobs API.
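For illustration, here is a minimal sketch of triggering another job through the Jobs API run-now endpoint from Python with the requests library; the host, token, and job ID are placeholder assumptions that you would replace with your own values.

import requests

# Placeholder values (assumptions): replace with your workspace URL, a personal access token, and the target job ID
DATABRICKS_HOST = "https://<your-workspace>.cloud.databricks.com"
DATABRICKS_TOKEN = "<your-personal-access-token>"
JOB_ID = 123

# POST /api/2.1/jobs/run-now triggers a single run of an existing job
response = requests.post(
    f"{DATABRICKS_HOST}/api/2.1/jobs/run-now",
    headers={"Authorization": f"Bearer {DATABRICKS_TOKEN}"},
    json={"job_id": JOB_ID},
)
response.raise_for_status()
print(response.json())  # the response includes the run_id of the triggered run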
04-08-2023 12:34 AM
Hi @Riya Vadhwani,
Hope everything is going great.
Just wanted to check in if you were able to resolve your issue. If yes, would you be happy to mark an answer as best so that other members can find the solution more quickly? If not, please tell us so we can help you.
Cheers!
07-14-2023 02:40 AM - edited 07-14-2023 02:40 AM
A nice way to do this is to use the Databricks Python SDK:
(%pip install databricks-sdk)
from databricks.sdk import WorkspaceClient

if condition:
    DATABRICKS_HOST = "your-hostname"
    DATABRICKS_TOKEN = "your-personal-access-token"
    # Authenticate against the workspace and trigger the other job by its ID
    w = WorkspaceClient(host=DATABRICKS_HOST, token=DATABRICKS_TOKEN)
    w.jobs.run_now(job_id=your_job_id)
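If you also need to enforce the dependency requirement from the original question, a minimal sketch along the same lines (the host, token, and job IDs below are hypothetical placeholders) is to wait for the upstream run to finish before triggering the dependent job; run_now returns a waiter, and calling result() on it should block until the run reaches a terminal state.

from databricks.sdk import WorkspaceClient

# Hypothetical placeholders: replace with your real workspace values and job IDs
DATABRICKS_HOST = "your-hostname"
DATABRICKS_TOKEN = "your-personal-access-token"
upstream_job_id = 111
downstream_job_id = 222

w = WorkspaceClient(host=DATABRICKS_HOST, token=DATABRICKS_TOKEN)

# Trigger the upstream job and wait for its run to terminate
upstream_run = w.jobs.run_now(job_id=upstream_job_id).result()

# Only trigger the dependent job if the upstream run succeeded
if upstream_run.state and upstream_run.state.result_state and upstream_run.state.result_state.value == "SUCCESS":
    w.jobs.run_now(job_id=downstream_job_id)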
07-14-2023 06:42 AM
@Ria Hi, this feature was in development when I attended the last quarterly roadmap session, and I believe it may now be available in the latest versions or possibly in Private Preview. You can check with your Databricks Solution Architect. Even if it is not available yet, it could be coming pretty soon.