
Orchestrate Databricks jobs with Apache Airflow

Sourav-Kundu
Contributor

You can orchestrate Databricks jobs with Apache Airflow.

The Databricks provider for Airflow implements the following operators (a minimal example DAG is sketched after the list):

DatabricksCreateJobsOperator: Creates a new Databricks job or resets an existing one.

DatabricksRunNowOperator: Runs an existing Databricks job using the api/2.1/jobs/run-now API endpoint.

DatabricksSubmitRunOperator: Submits a one-time Spark job run to Databricks using the api/2.1/jobs/runs/submit API endpoint.

DatabricksCopyIntoOperator: Executes the COPY INTO command against a Databricks SQL endpoint or a Databricks cluster.

DatabricksSqlOperator: Executes SQL code against a Databricks SQL endpoint or a Databricks cluster.

https://docs.databricks.com/en/jobs/how-to/use-airflow-with-jobs.html
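Here is a minimal sketch of a DAG wiring two of these operators together. The connection ID, cluster spec, notebook path, and job ID below are placeholder assumptions, not values from this post; swap in your own workspace details.

```python
# Minimal sketch: orchestrating Databricks jobs from Airflow.
# Assumes an Airflow connection named "databricks_default" that points at your workspace.
from datetime import datetime

from airflow import DAG
from airflow.providers.databricks.operators.databricks import (
    DatabricksRunNowOperator,
    DatabricksSubmitRunOperator,
)

with DAG(
    dag_id="databricks_orchestration_example",
    start_date=datetime(2024, 1, 1),
    schedule=None,
    catchup=False,
) as dag:
    # Submit a one-time run with an inline job spec (api/2.1/jobs/runs/submit).
    submit_run = DatabricksSubmitRunOperator(
        task_id="submit_notebook_run",
        databricks_conn_id="databricks_default",
        new_cluster={
            "spark_version": "13.3.x-scala2.12",  # placeholder runtime version
            "node_type_id": "i3.xlarge",          # placeholder node type
            "num_workers": 2,
        },
        notebook_task={"notebook_path": "/Shared/example_notebook"},  # placeholder path
    )

    # Trigger an existing Databricks job by its job ID (api/2.1/jobs/run-now).
    run_existing_job = DatabricksRunNowOperator(
        task_id="run_existing_job",
        databricks_conn_id="databricks_default",
        job_id=12345,  # placeholder job ID
    )

    submit_run >> run_existing_job
```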

@Advika_ 

1 REPLY

Advika_
Databricks Employee

Good one @Sourav-Kundu! Your clear explanations of the operators really simplify job management, plus the resource link you included makes it easy for everyone to dive deeper 👍.
