How to Version & Deploy Databricks Workflows with Azure DevOps (CI/CD)?
02-19-2025 07:25 AM
Hi everyone,
I’m trying to set up versioning and CI/CD for my Databricks workflows using Azure DevOps and Git. While I’ve successfully versioned notebooks in a Git repo, I’m struggling to handle the workflows themselves (which define orchestration, task dependencies, schedules, etc.).
How can I properly version and deploy Databricks workflows across different environments (Dev, Test, Prod) using Azure DevOps?
Thanks in advance!
02-19-2025 07:28 AM
@Alberto_Umana and @nicole_lu_PM, maybe you have a clue? Would DABs (Databricks Asset Bundles) be useful in this case?
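For anyone unfamiliar: Asset Bundles let you declare jobs/workflows as source-controlled YAML and deploy the same definition to Dev/Test/Prod via per-environment targets. A minimal sketch of what a bundle could look like — the bundle/job names, notebook path, cluster settings, and workspace URLs below are all placeholders:

```yaml
# databricks.yml — minimal bundle sketch (all names and URLs are placeholders)
bundle:
  name: my_workflows

resources:
  jobs:
    daily_etl:
      name: daily_etl
      tasks:
        - task_key: main
          notebook_task:
            notebook_path: ./notebooks/etl.py   # hypothetical notebook in the repo
          new_cluster:
            spark_version: 15.4.x-scala2.12
            node_type_id: Standard_DS3_v2
            num_workers: 2

# One target per environment; deploy with `databricks bundle deploy -t <target>`
targets:
  dev:
    mode: development
    default: true
    workspace:
      host: https://adb-1111111111111111.11.azuredatabricks.net
  test:
    workspace:
      host: https://adb-2222222222222222.22.azuredatabricks.net
  prod:
    mode: production
    workspace:
      host: https://adb-3333333333333333.33.azuredatabricks.net
```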
02-19-2025 07:33 AM
My current approach is to manually copy and paste the workflow YAML definitions across workspaces and to version them in Git/Azure DevOps by saving them as DBFS files. The CD step is then handled with the Databricks DBFS File Deployment task by Data Thirst Ltd.
While this works, I’m still looking for a more automated and scalable solution. Has anyone found a better way to manage Databricks workflow versioning and deployment in a CI/CD setup? Would love to hear your insights!
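In case it helps others searching for the same thing: one more automated pattern is to have an Azure DevOps pipeline validate and deploy a bundle with the Databricks CLI. A sketch, assuming the bundle file above, a `DATABRICKS_HOST`/`DATABRICKS_TOKEN` pair stored as secret pipeline variables, and the documented CLI install script:

```yaml
# azure-pipelines.yml — deploy the bundle on merges to main
# (variable names are assumptions; store the token as a secret variable)
trigger:
  branches:
    include:
      - main

pool:
  vmImage: ubuntu-latest

steps:
  # Bundles require the new Databricks CLI (v0.205+)
  - script: curl -fsSL https://raw.githubusercontent.com/databricks/setup-cli/main/install.sh | sh
    displayName: Install Databricks CLI

  # Check the bundle definition before touching any workspace
  - script: databricks bundle validate -t prod
    displayName: Validate bundle
    env:
      DATABRICKS_HOST: $(DATABRICKS_HOST)
      DATABRICKS_TOKEN: $(DATABRICKS_TOKEN)

  # Push the jobs/notebooks declared in databricks.yml to the prod target
  - script: databricks bundle deploy -t prod
    displayName: Deploy bundle
    env:
      DATABRICKS_HOST: $(DATABRICKS_HOST)
      DATABRICKS_TOKEN: $(DATABRICKS_TOKEN)
```

Running the same deploy with `-t dev` or `-t test` in earlier stages gives per-environment promotion from a single repo, which would replace the manual copy/paste step entirely.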

