You can orchestrate Databricks jobs with Apache Airflow.
The Databricks provider implements the following operators:
DatabricksCreateJobsOperator : Creates a new Databricks job or resets an existing one.
DatabricksRunNowOperator : Triggers an existing Databricks job using the api/2.1/jobs/run-now API endpoint.
DatabricksSubmitRunOperator : Submits a Spark job run to Databricks using the api/2.1/jobs/runs/submit API endpoint.
DatabricksCopyIntoOperator : Executes a COPY INTO command on a Databricks SQL endpoint or a Databricks cluster.
DatabricksSqlOperator : Executes SQL code on a Databricks SQL endpoint or a Databricks cluster.
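As a rough sketch, here is what a minimal DAG using DatabricksSubmitRunOperator could look like. The connection id, cluster spec, and notebook path below are placeholder assumptions for illustration, not values from this post:

```python
# Minimal sketch of an Airflow DAG that submits a one-time Databricks run.
# Assumes the apache-airflow-providers-databricks package is installed and an
# Airflow connection named "databricks_default" points at your workspace.
# The cluster spec and notebook path are hypothetical examples.
import pendulum
from airflow import DAG
from airflow.providers.databricks.operators.databricks import DatabricksSubmitRunOperator

with DAG(
    dag_id="databricks_submit_run_example",
    start_date=pendulum.datetime(2024, 1, 1, tz="UTC"),
    schedule=None,   # trigger manually
    catchup=False,
) as dag:
    submit_run = DatabricksSubmitRunOperator(
        task_id="submit_run",
        databricks_conn_id="databricks_default",
        # Payload mirrors the api/2.1/jobs/runs/submit request body.
        json={
            "new_cluster": {
                "spark_version": "13.3.x-scala2.12",
                "node_type_id": "i3.xlarge",
                "num_workers": 2,
            },
            "notebook_task": {"notebook_path": "/Shared/example-notebook"},
        },
    )
```

DatabricksRunNowOperator looks similar, but instead of a full cluster/task spec you pass the job_id (or job_name) of a job already defined in Databricks.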
https://docs.databricks.com/en/jobs/how-to/use-airflow-with-jobs.html
@Advika