Addressing Pipeline Error Handling in Databricks bundle run with CI/CD when SUCCESS WITH FAILURES
04-08-2024 07:28 AM
I'm using Databricks Asset Bundles, and my jobs contain tasks with "all done" dependency rules. When running in CI/CD, if a task fails, `databricks bundle run` reports something like "the job xxxx SUCCESS_WITH_FAILURES" and exits successfully, so the pipeline passes and could deploy a broken job to production. I'd prefer that CI/CD raise an error in these cases instead of marking the run as a success. Is there a way to do this, such as a parameter on `bundle run`? If not, should I keep "all done" rules in production but switch to "all succeeded" in development so that CI/CD catches the errors? I understand that I should have a QA environment to test these cases, but unfortunately that's not an option right now.
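One way to approach this, sketched below under assumptions: instead of relying on the exit code of `bundle run`, the CI step could fetch the run's final `result_state` (for example from the JSON returned by `databricks jobs get-run <run_id>`, which exposes it under `state.result_state` in Jobs API 2.1) and fail the pipeline on anything other than a clean `SUCCESS`. The fetching step is omitted here; this is only the decision logic, not an official Databricks feature.

```python
import sys

# Assumption: only a clean SUCCESS should let CI pass; states such as
# SUCCESS_WITH_FAILURES (produced when "all done" rules let a job finish
# despite failed tasks) are treated as failures.
PASSING_STATES = {"SUCCESS"}


def ci_should_pass(result_state: str) -> bool:
    """Return True only when the run finished with no failed tasks."""
    return result_state in PASSING_STATES


if __name__ == "__main__":
    # The state string would come from the Jobs API; here it is taken
    # from the command line for illustration.
    state = sys.argv[1] if len(sys.argv) > 1 else "SUCCESS_WITH_FAILURES"
    if not ci_should_pass(state):
        print(f"Run ended in {state}; failing the CI pipeline.")
        sys.exit(1)
```

Wiring this into CI would mean running it after `databricks bundle run` and letting its nonzero exit code fail the stage, without changing the job's dependency rules per environment.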
04-23-2024 04:28 AM
Awesome answer, I will try the first approach. It seems less intrusive than changing my pipeline's dependency rules per environment; this way I can keep a single, general pipeline definition for deployment across all environments. We plan to set up a QA environment after migrating all our cloud resources to Terraform. Thanks!

