if else conditions in databricks asset bundles
09-09-2024 02:09 AM
Can I use if-else conditions in databricks.yml and parameterize my asset bundles similarly to Azure Pipelines YAML?
09-09-2024 03:34 AM
Hi @sandy311 ,
Yes, since Databricks Asset Bundle templates are based on the Go templating mechanism, you can leverage that. For example (see also this related thread):
Solved: Asset Bundles : how to conditionally set content o... - Databricks Community - 77236
---
version: "1.0"
{{- if .topic_sourcing }}
status: enabled
topics:
  - name: "my-name"
    table_name: "my-tablename"
{{- else }}
status: disabled
{{- end }}
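
Note that this Go templating is only evaluated when the bundle is generated from a template with databricks bundle init, so the condition is resolved once at initialization time. A minimal sketch of how that could look, assuming a custom template folder (the folder and file names here, like my-bundle-template and init-config.json, and the topic_sourcing property itself, are placeholders): the template declares the input in its databricks_template_schema.json, and you render it with the CLI.

databricks_template_schema.json (in the template folder):

{
  "properties": {
    "topic_sourcing": {
      "type": "boolean",
      "description": "Enable topic sourcing in the generated bundle",
      "default": false
    }
  }
}

Rendering the template (the if/else blocks above are resolved at this point, not during later validate or deploy runs):

databricks bundle init ./my-bundle-template --config-file init-config.json --output-dir my-bundle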
09-09-2024 04:13 AM
If I'm not mistaken, this will only work when we initialize the Databricks asset bundle. What I need is a solution that applies conditions while validating or deploying the bundle.
09-09-2024 07:24 AM
Hi @sandy311 ,
Yep, you're right. It won't work with validation.
09-09-2024 07:17 AM
Hi @sandy311 ,
Could you please provide more details on what you’re trying to achieve?
It seems like you are looking to use Databricks Asset Bundles as complete CI/CD pipelines. While Databricks Asset Bundles are a crucial part of the CI/CD process, they focus on deploying Databricks resources like notebooks, wheels, and jobs. They do not inherently provide the full capabilities of a CI/CD pipeline such as automated testing, conditional deployments, or rollback mechanisms.
For example, the workflow you described in your previous question (deploying Databricks jobs, validating their execution, and deciding whether to retain or roll back the deployed code based on the job's success) is typically managed by a CI/CD platform such as Azure DevOps, GitLab, or GitHub Actions. These platforms provide control over deployment, testing, and error handling that goes beyond what Databricks Asset Bundles offer on their own. A simplified sketch of such a pipeline follows below.
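
As a rough illustration only (names such as DEPLOY_ENV and my_job are placeholders, and this is a sketch rather than a drop-in pipeline), the conditional deploy / smoke-test / rollback logic could live in Azure Pipelines steps that call the Databricks CLI:

steps:
  - script: databricks bundle validate -t $(DEPLOY_ENV)
    displayName: Validate bundle

  - script: databricks bundle deploy -t $(DEPLOY_ENV)
    displayName: Deploy bundle
    condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/main'))

  - script: databricks bundle run my_job -t $(DEPLOY_ENV)
    displayName: Run the deployed job as a smoke test

  - script: databricks bundle destroy -t $(DEPLOY_ENV) --auto-approve
    displayName: Tear the deployment down if the smoke test failed
    condition: failed()

The last step is a simplified form of rollback (destroying the deployment); in practice you would more likely redeploy a previous Git tag or release instead of destroying the bundle.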
Key Points:
- Integration with CI/CD pipelines: Databricks Asset Bundles are used within CI/CD pipelines but do not replace them. They serve as a mechanism for defining and deploying Databricks resources consistently across environments.
- CI/CD platforms for workflow management: for tasks such as conditional deployments, testing job outcomes, and performing rollbacks, leverage CI/CD tools like Azure DevOps, GitLab CI/CD, or GitHub Actions. These tools provide the logic and controls needed to manage complex workflows and automate decision-making.
- Deployment control: while Databricks Asset Bundles support some parameterization via Go templating at initialization and via variables and targets at deploy time (see the sketch after this list), the actual control over validation, testing, and deployment flow is best handled by an external CI/CD tool.
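
To make the deploy-time parameterization concrete, here is a minimal, illustrative databricks.yml sketch using bundle variables and targets (the names my_bundle, topic_sourcing, and example_job are placeholders). Unlike the Go templating shown earlier, these values are resolved by bundle validate and bundle deploy:

bundle:
  name: my_bundle

variables:
  topic_sourcing:
    description: Illustrative toggle, resolved at validate/deploy time
    default: "disabled"

targets:
  dev:
    default: true
  prod:
    variables:
      topic_sourcing: "enabled"

resources:
  jobs:
    example_job:
      name: "example-job-${var.topic_sourcing}"

Deploy-time usage, for example:

databricks bundle deploy -t prod
databricks bundle deploy -t dev --var="topic_sourcing=enabled"

This lets you select values per target or per deployment, but it does not give you arbitrary if/else branching the way Azure Pipelines conditions do.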
Summary:
Databricks Asset Bundles are designed for deploying Databricks assets as part of a broader CI/CD strategy. To achieve end-to-end CI/CD functionality such as testing, validation, and rollback, integrate the bundles with a dedicated CI/CD platform.

