Orchestrate Databricks jobs with Apache Airflow
10-29-2024 04:58 AM - edited 10-29-2024 05:02 AM
You can orchestrate Databricks jobs with Apache Airflow using the Databricks provider package (apache-airflow-providers-databricks).
The Databricks provider implements the following operators (a couple of example DAG sketches follow the list):
- DatabricksCreateJobsOperator: Creates a new Databricks job or resets an existing one.
- DatabricksRunNowOperator: Runs an existing Databricks job using the api/2.1/jobs/run-now API endpoint.
- DatabricksSubmitRunOperator: Submits a one-time Spark job run to Databricks using the api/2.1/jobs/runs/submit API endpoint.
- DatabricksCopyIntoOperator: Executes a COPY INTO command on a Databricks SQL endpoint (SQL warehouse) or a Databricks cluster.
- DatabricksSqlOperator: Executes SQL code on a Databricks SQL endpoint (SQL warehouse) or a Databricks cluster.
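Here is a minimal sketch of a DAG that uses the first two job operators. It assumes an Airflow connection named databricks_default (workspace host plus a personal access token) has already been configured; the job ID, cluster spec, and notebook path are placeholders, not values from this post:

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.databricks.operators.databricks import (
    DatabricksRunNowOperator,
    DatabricksSubmitRunOperator,
)

with DAG(
    dag_id="databricks_orchestration_example",
    start_date=datetime(2024, 10, 1),
    schedule=None,  # Airflow 2.4+ style argument; trigger manually
    catchup=False,
) as dag:
    # Trigger a job already defined in Databricks (api/2.1/jobs/run-now).
    run_existing_job = DatabricksRunNowOperator(
        task_id="run_existing_job",
        databricks_conn_id="databricks_default",
        job_id=12345,  # placeholder: ID of an existing Databricks job
        notebook_params={"run_date": "{{ ds }}"},
    )

    # Submit a one-time run on a fresh job cluster (api/2.1/jobs/runs/submit).
    submit_one_time_run = DatabricksSubmitRunOperator(
        task_id="submit_one_time_run",
        databricks_conn_id="databricks_default",
        new_cluster={
            "spark_version": "15.4.x-scala2.12",
            "node_type_id": "i3.xlarge",  # placeholder node type
            "num_workers": 2,
        },
        notebook_task={"notebook_path": "/Shared/example_notebook"},
    )

    run_existing_job >> submit_one_time_run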
https://docs.databricks.com/en/jobs/how-to/use-airflow-with-jobs.html
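For the two SQL-oriented operators, a similar sketch is shown below. It assumes the same databricks_default connection and a SQL warehouse named my-warehouse; the warehouse name, table name, and file path are all placeholders:

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.databricks.operators.databricks_sql import (
    DatabricksCopyIntoOperator,
    DatabricksSqlOperator,
)

with DAG(
    dag_id="databricks_sql_example",
    start_date=datetime(2024, 10, 1),
    schedule=None,
    catchup=False,
) as dag:
    # Bulk-load files into a Delta table with COPY INTO.
    copy_into_table = DatabricksCopyIntoOperator(
        task_id="copy_into_table",
        databricks_conn_id="databricks_default",
        sql_endpoint_name="my-warehouse",  # placeholder SQL warehouse name
        table_name="main.default.raw_events",  # placeholder table
        file_format="JSON",
        file_location="s3://my-bucket/raw/events/",  # placeholder path
    )

    # Run arbitrary SQL against the same warehouse.
    summarize = DatabricksSqlOperator(
        task_id="summarize",
        databricks_conn_id="databricks_default",
        sql_endpoint_name="my-warehouse",
        sql="SELECT count(*) FROM main.default.raw_events",
    )

    copy_into_table >> summarize
```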
10-29-2024 07:21 AM
Good one @Sourav-Kundu! Your clear explanations of the operators really simplify job management, plus the resource link you included makes it easy for everyone to dive deeper 👍.

