
Orchestrate Databricks jobs with Apache Airflow

Sourav-Kundu
Contributor

You can orchestrate Databricks jobs with Apache Airflow using the apache-airflow-providers-databricks provider package.

The Databricks provider implements the following operators (minimal usage sketches are included further down):

DatabricksCreateJobsOperator: Creates a new Databricks job or resets an existing one.

DatabricksRunNowOperator: Runs an existing Databricks job using the api/2.1/jobs/run-now API endpoint.

DatabricksSubmitRunOperator: Submits a Spark job run to Databricks using the api/2.1/jobs/runs/submit API endpoint.

DatabricksCopyIntoOperator: Executes a COPY INTO command against a Databricks SQL endpoint or a Databricks cluster.

DatabricksSqlOperator: Executes SQL code against a Databricks SQL endpoint or a Databricks cluster.
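
As a rough illustration of the jobs-oriented operators, here is a minimal, untested DAG sketch. The cluster settings, notebook path, job ID, and the databricks_default connection ID are placeholder assumptions for illustration, not values from this post:

```python
# Minimal sketch: submit a one-time run, then trigger an existing job.
from datetime import datetime

from airflow import DAG
from airflow.providers.databricks.operators.databricks import (
    DatabricksRunNowOperator,
    DatabricksSubmitRunOperator,
)

with DAG(
    dag_id="databricks_orchestration_example",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; use schedule_interval on older versions
    catchup=False,
) as dag:
    # One-time run on a new job cluster (api/2.1/jobs/runs/submit).
    submit_run = DatabricksSubmitRunOperator(
        task_id="submit_notebook_run",
        databricks_conn_id="databricks_default",  # assumes a configured Airflow connection
        new_cluster={
            "spark_version": "13.3.x-scala2.12",
            "node_type_id": "i3.xlarge",
            "num_workers": 2,
        },
        notebook_task={"notebook_path": "/Shared/example_notebook"},  # hypothetical path
    )

    # Trigger an existing Databricks job by ID (api/2.1/jobs/run-now).
    run_now = DatabricksRunNowOperator(
        task_id="run_existing_job",
        databricks_conn_id="databricks_default",
        job_id=12345,  # hypothetical job ID
    )

    submit_run >> run_now
```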

https://docs.databricks.com/en/jobs/how-to/use-airflow-with-jobs.html
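
Similarly, a hedged sketch for the SQL-oriented operators; the warehouse name, table, and storage path below are hypothetical:

```python
# Minimal sketch: COPY INTO a table, then run SQL against a SQL warehouse.
from datetime import datetime

from airflow import DAG
from airflow.providers.databricks.operators.databricks_sql import (
    DatabricksCopyIntoOperator,
    DatabricksSqlOperator,
)

with DAG(
    dag_id="databricks_sql_example",
    start_date=datetime(2024, 1, 1),
    schedule=None,
    catchup=False,
) as dag:
    # Load files into a table with COPY INTO.
    copy_into = DatabricksCopyIntoOperator(
        task_id="copy_into_table",
        databricks_conn_id="databricks_default",
        sql_endpoint_name="my-sql-warehouse",   # hypothetical SQL warehouse
        table_name="default.sales_raw",         # hypothetical target table
        file_location="s3://my-bucket/sales/",  # hypothetical source path
        file_format="CSV",
        format_options={"header": "true"},
    )

    # Run arbitrary SQL against the same warehouse.
    run_sql = DatabricksSqlOperator(
        task_id="aggregate_sales",
        databricks_conn_id="databricks_default",
        sql_endpoint_name="my-sql-warehouse",
        sql="SELECT count(*) FROM default.sales_raw",
    )

    copy_into >> run_sql
```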

@Advika 

1 REPLY

Advika
Databricks Employee

Good one @Sourav-Kundu! Your clear explanations of the operators really simplify job management, plus the resource link you included makes it easy for everyone to dive deeper 👍.
