Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

How to Version & Deploy Databricks Workflows with Azure DevOps (CI/CD)?

mkEngineer
New Contributor III

Hi everyone,

I’m trying to set up versioning and CI/CD for my Databricks workflows using Azure DevOps and Git. While I’ve successfully versioned notebooks in a Git repo, I’m struggling with handling workflows (which define orchestration, dependencies, schema, etc.).

How can I properly version and deploy Databricks workflows across different environments (Dev, Test, Prod) using Azure DevOps?

Thanks in advance!

2 REPLIES

mkEngineer
New Contributor III

@Alberto_Umana and @nicole_lu_PM maybe you have a clue? Would DABs be useful in this case? 
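For context, Databricks Asset Bundles (DABs) let you declare jobs/workflows as YAML files that live in the same repo as your notebooks, with per-environment targets. A minimal `databricks.yml` sketch (the bundle name and workspace hosts below are placeholders, not from this thread):

```yaml
# Minimal Databricks Asset Bundle sketch — illustrative only;
# bundle name and host URLs are placeholders.
bundle:
  name: my_workflows

include:
  - resources/*.yml   # job/workflow definitions live here, versioned in Git

targets:
  dev:
    mode: development
    workspace:
      host: https://adb-dev-example.azuredatabricks.net
  prod:
    mode: production
    workspace:
      host: https://adb-prod-example.azuredatabricks.net
```

With a layout like this, the same workflow definition is deployed to Dev/Test/Prod by selecting a target, instead of copying YAML between workspaces.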

 

mkEngineer
New Contributor III

As of now, my approach is to manually copy/paste the workflow YAMLs across workspaces and version them in Git/Azure DevOps. The CD step then pushes them to the workspace as DBFS files using the Databricks DBFS File Deployment task by Data Thirst Ltd.

While this works, I’m still looking for a more automated and scalable solution. Has anyone found a better way to manage Databricks workflow versioning and deployment in a CI/CD setup? Would love to hear your insights!
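One way to automate this (a sketch, not a drop-in pipeline: the variable names and trigger branch below are assumptions) is an Azure DevOps pipeline that installs the Databricks CLI and deploys an asset bundle per environment, replacing the manual copy/paste and DBFS upload:

```yaml
# Hypothetical azure-pipelines.yml sketch — variable names ($(DATABRICKS_HOST),
# $(DATABRICKS_TOKEN)) and the 'main' trigger branch are placeholders.
trigger:
  branches:
    include: [ main ]

pool:
  vmImage: ubuntu-latest

steps:
  - script: curl -fsSL https://raw.githubusercontent.com/databricks/setup-cli/main/install.sh | sh
    displayName: Install Databricks CLI

  - script: databricks bundle validate -t dev
    displayName: Validate bundle
    env:
      DATABRICKS_HOST: $(DATABRICKS_HOST)
      DATABRICKS_TOKEN: $(DATABRICKS_TOKEN)

  - script: databricks bundle deploy -t dev
    displayName: Deploy bundle to dev
    env:
      DATABRICKS_HOST: $(DATABRICKS_HOST)
      DATABRICKS_TOKEN: $(DATABRICKS_TOKEN)
```

Promotion to Test/Prod would then be additional stages calling `databricks bundle deploy -t test` / `-t prod`, gated by environment approvals in Azure DevOps.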