12-16-2024 05:24 AM
I'm using Databricks Asset Bundles with Azure DevOps CI/CD for workflow deployment. While the initial deployment to production works fine, I encounter an issue when updating the workflow in the development environment and redeploying it to production. Upon redeployment, all historical run logs are lost. Is there a way to preserve these logs during deployment, perhaps using a command like databricks bundle reset -t prod?
It should display all the run logs like this:
but after deploying, everything is gone.
12-16-2024 06:04 AM
Hi @ynskrbn,
This is likely because the deployment process overwrites the existing job definitions, which in turn resets the job history. To preserve historical logs, make sure the job definitions are not recreated during redeployment; instead, the existing jobs should be updated in place so that their history is retained.
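A minimal databricks.yml sketch (all names are hypothetical) illustrating the point above: Asset Bundles identify a job by its resource key (here etl_job), so as long as that key and the bundle name stay the same across deployments, databricks bundle deploy updates the existing job in place and its run history is preserved. Renaming the key or the bundle is treated as a new job with an empty history.

```yaml
# databricks.yml — hypothetical minimal example
bundle:
  name: my_etl_bundle        # renaming this creates new jobs on the next deploy

resources:
  jobs:
    etl_job:                 # the resource key is the job's identity in the bundle;
      name: "Daily ETL"      # keep the key stable so deploys update the job
      tasks:                 # in place and its run history is preserved
        - task_key: main
          notebook_task:
            notebook_path: ./notebooks/etl.py
```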
12-16-2024 06:09 AM
I know how to do this with the API, but not with Databricks Asset Bundles in my deployment pipeline. Where can I find the documentation?
12-16-2024 06:37 AM
Please let me know if this documentation helps you: https://learn.microsoft.com/en-us/azure/databricks/dev-tools/ci-cd/ci-cd-azure-devops
12-17-2024 11:42 AM
When you re-deploy your job, do you increment the version? (e.g., 4.3.0 -> 4.3.1)
I have run into this: when I change a definition in the databricks.yml, for example the bundle name, the job is detected as a new workflow.
Can you explain what you mean by "re-deployment"? In our Azure DevOps CI/CD pipeline, when we generate a new release of the workflow, only this command is used:
databricks bundle deploy --target prod
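For context, a hedged sketch of how the targets might be declared in the same databricks.yml (target names and workspace URLs are hypothetical). Each target keeps its own deployment state, so re-running databricks bundle deploy --target prod with unchanged resource keys should update the already-deployed jobs rather than recreating them:

```yaml
# databricks.yml — hypothetical targets section
targets:
  dev:
    mode: development
    workspace:
      host: https://adb-dev.azuredatabricks.net    # hypothetical workspace URL
  prod:
    mode: production
    workspace:
      host: https://adb-prod.azuredatabricks.net   # hypothetical workspace URL
```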