
Automate the Databricks workflow deployment

joao_vnb
New Contributor III

Hi everyone,

Do you guys know if it's possible to automate Databricks workflow deployment through Azure DevOps (like what we do with the deployment of notebooks)?

1 ACCEPTED SOLUTION

Accepted Solutions

Anonymous
Not applicable

To automate the deployment of Databricks workflows, you can use the Databricks REST API (the Jobs API) together with a scripting language such as Python or Bash. The script can create a new workflow (job), add tasks to it, and update existing workflows. Triggered from a CI/CD pipeline, it deploys the workflow automatically whenever changes are pushed to the source code repository, which reduces the risk of manual errors and speeds up delivery.
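For illustration, here is a minimal sketch of what such a script could look like against the Jobs API 2.1. The file name job.json, the environment variable names, and the create-or-reset logic are assumptions for this example; in an Azure DevOps pipeline the host and token would typically come from pipeline variables or a service connection.

```python
# deploy_job.py - minimal sketch: create or update a Databricks workflow (job)
# from a Jobs API 2.1 settings file kept in the repo (assumed here to be job.json).
import json
import os

import requests

HOST = os.environ["DATABRICKS_HOST"]  # e.g. https://adb-<workspace-id>.azuredatabricks.net
HEADERS = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}


def find_job_id(name):
    """Return the job_id of an existing job with this exact name, or None."""
    resp = requests.get(f"{HOST}/api/2.1/jobs/list", headers=HEADERS, params={"name": name})
    resp.raise_for_status()
    jobs = resp.json().get("jobs", [])
    return jobs[0]["job_id"] if jobs else None


def deploy(spec_path="job.json"):
    """Create the job if it does not exist yet, otherwise overwrite its settings."""
    with open(spec_path) as f:
        settings = json.load(f)  # {"name": ..., "tasks": [...], ...}

    job_id = find_job_id(settings["name"])
    if job_id is None:
        resp = requests.post(f"{HOST}/api/2.1/jobs/create", headers=HEADERS, json=settings)
        resp.raise_for_status()
        job_id = resp.json()["job_id"]
    else:
        resp = requests.post(
            f"{HOST}/api/2.1/jobs/reset",
            headers=HEADERS,
            json={"job_id": job_id, "new_settings": settings},
        )
        resp.raise_for_status()
    return job_id


if __name__ == "__main__":
    print(f"Deployed job_id={deploy()}")
```

An Azure DevOps pipeline step would then simply run this script after checkout, so the job definition kept in version control stays the source of truth.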


7 REPLIES

Manoj12421
Valued Contributor II

This might help you:

https://learn.microsoft.com/en-us/azure/databricks/dev-tools/ci-cd/ci-cd-azure-devops

joao_vnb
New Contributor III

Thanks for the answer! It isn't exactly what I want, though.

Debayan
Databricks Employee

Hi, yes, it is possible. Please refer to https://learn.microsoft.com/en-us/azure/databricks/dev-tools/ci-cd/ci-cd-azure-devops and let us know if this helps.

joao_vnb
New Contributor III

Thanks for the answer! It isn't exactly what I want, though.

Anonymous
Not applicable

To automate the deployment of Databricks workflows, you can use the Databricks REST API (the Jobs API) together with a scripting language such as Python or Bash. The script can create a new workflow (job), add tasks to it, and update existing workflows. Triggered from a CI/CD pipeline, it deploys the workflow automatically whenever changes are pushed to the source code repository, which reduces the risk of manual errors and speeds up delivery.

joao_vnb
New Contributor III

Thanks for the answer! Could you give an example of the structure of the Python script?

Or a documentation link where I can find it...

asingamaneni
New Contributor II

Did you get a chance to try Brickflow? https://github.com/Nike-Inc/brickflow

You can find the documentation here: https://engineering.nike.com/brickflow/v0.11.2/

Brickflow uses Databricks Asset Bundles (DAB) under the hood but provides a Pythonic way to create your workflows.
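As a very rough sketch only: the workflow name, cluster ID, and task bodies below are placeholders, and the exact decorator signatures should be checked against the Brickflow docs linked above. A workflow definition looks roughly like this:

```python
# Rough sketch of a Brickflow workflow definition; "existing-cluster-id",
# the workflow name, and the task bodies are placeholders for this example.
from brickflow import Cluster, Workflow

wf = Workflow(
    "demo_workflow",
    default_cluster=Cluster.from_existing_cluster("existing-cluster-id"),
)


@wf.task
def ingest():
    # first task in the job
    return "ingested"


@wf.task(depends_on=ingest)
def transform():
    # runs after ingest succeeds
    return "transformed"
```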
