Hello Community,
I'm encountering a challenging issue with my Azure Pipeline, and I'm hoping someone here might have some insights. I'm attempting to deploy a Databricks bundle that includes both notebooks and workflow YAML files. When deploying the bundle locally via the Databricks CLI, everything works perfectly: both the notebooks and the YAML files are uploaded correctly into my Databricks environment.
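For context, my bundle layout is roughly along these lines (this is a simplified sketch with placeholder names and paths, not my exact config; the workflow definitions are referenced from databricks.yml via an include glob):

```yaml
# databricks.yml (simplified sketch, placeholder values)
bundle:
  name: my_bundle

include:
  # the workflow/job definitions live in separate YAML files
  - resources/*.yml

targets:
  dev:
    workspace:
      host: https://adb-XXXXXXXXXXXX.azuredatabricks.net
```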
I also noticed that if I run "databricks bundle deploy" from my local environment, it creates three folders in Databricks:
- artifacts
- files
- state
But when I deploy through my Azure Pipeline, the "state" folder is missing.
The problem arises when I try to deploy the same bundle through my Azure Pipeline: in that case, only the notebooks are uploaded, and the workflow YAML files are missing. I've made sure that the CLI versions match between my local machine and the pipeline, and that the pipeline configuration is correct. The verbose logs don't show any obvious errors, and the environment variables all seem to be set correctly. It simply does not deploy my workflows.
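For reference, the deploy step in the pipeline is essentially a script step that installs the CLI and runs the deploy against the checked-out repo. A simplified sketch (target name, paths, and variable names are placeholders, not my exact pipeline):

```yaml
# azure-pipelines.yml (simplified sketch, placeholder values)
steps:
  - checkout: self

  - script: |
      # install the Databricks CLI (same version as used locally)
      curl -fsSL https://raw.githubusercontent.com/databricks/setup-cli/main/install.sh | sh
      # validate and deploy the bundle for the dev target
      databricks bundle validate -t dev
      databricks bundle deploy -t dev
    displayName: 'Deploy Databricks bundle'
    workingDirectory: '$(Build.SourcesDirectory)'
    env:
      DATABRICKS_HOST: $(DATABRICKS_HOST)
      DATABRICKS_TOKEN: $(DATABRICKS_TOKEN)
```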
Has anyone experienced a similar issue, or does anyone have an idea what might be going wrong here? Any tips or directions to investigate would be greatly appreciated.
Thank you in advance for your support!
Best Regards