Databricks Asset Bundle to deploy only one workflow
04-20-2024 08:46 PM
Hello Community -
I am trying to deploy only one workflow from my CI/CD. But whenever I deploy one workflow using "databricks bundle deploy -t prod", it deletes all the existing workflows in the target environment. Is there any option to deploy only one workflow without impacting the other workflows in a Databricks Asset Bundle? Thanks!!
04-21-2024 08:37 PM
Hi @niruban
In an ideal scenario it should not delete the existing workflows.
I am also using Databricks bundles to deploy to the prod environment, and it has never deleted the current workflows.
Can you send me the snippet where you can see a workflow being deleted?
04-22-2024 05:59 AM
@Rajani : This is what I am doing. I have a GitHub Actions step that runs:
- name: bundle-deploy
  run: |
    cd ${{ vars.HOME }}/dev-ops/databricks_cicd_deployment
    databricks bundle deploy --debug
Before running this step, I generate the YAML file for only the modified/newly created workflow under the resources directory of the DABs folder setup. Once I execute the above command, it removes all the other workflows in the target workspace that are not present under the resources folder path.
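This matches how bundles work: the resource files in the bundle describe the complete desired set of jobs for that bundle, so any job the bundle previously deployed that no longer has a YAML file is removed on the next deploy. A minimal sketch of such a bundle config (the bundle name, glob, and host are placeholders, not taken from the thread):

```yaml
# databricks.yml -- minimal hypothetical bundle config
bundle:
  name: databricks_cicd_deployment

include:
  - resources/*.yml   # every YAML matched here is part of the desired state;
                      # a job whose file is missing from this glob is deleted on deploy

targets:
  prod:
    workspace:
      host: https://<workspace-url>   # placeholder
```

So committing only the changed workflow's YAML into resources/ tells the bundle that the other workflows should no longer exist.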
12-10-2024 09:23 AM
Hi Team, a deployment via DAB (Databricks Asset Bundles) reads all the YAML files present and generates workflows from them. In versions of the Databricks CLI prior to 0.236 (or the latest one), it used to delete all workflows by making delete-workflow API calls and then regenerate only those present as YAML configs. Now it copies all workspace files to Databricks and deploys the YAML files present as workflows (removing those that are not present as YAML configs), and it does not interfere with any workloads that are running at the time of deployment.
Since there is no command yet for deploying a single workflow, this is the default behavior of DAB. A future release of the Databricks CLI may add support for deploying selected workflows by YAML rather than the whole bundle.
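Until selective deployment exists, one common workaround is to keep every workflow's YAML committed in the bundle's resources directory, so a full deploy simply re-applies all of them. Another is to split workflows into separate bundles, each with its own databricks.yml, so deploying one bundle cannot touch jobs owned by another. A sketch of the second approach (directory names and host are hypothetical):

```yaml
# bundles/workflow_a/databricks.yml -- one bundle per workflow
bundle:
  name: workflow_a_bundle

include:
  - resources/workflow_a.yml   # only this workflow belongs to this bundle

targets:
  prod:
    workspace:
      host: https://<workspace-url>   # placeholder
```

Running `databricks bundle deploy -t prod` from bundles/workflow_a then affects only the jobs defined in that bundle, leaving workflows managed by other bundles (or created manually) untouched.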
Hope this helps.

