09-16-2024 02:37 AM - edited 09-16-2024 02:49 AM
I am using Databricks Asset Bundles to deploy Databricks workflows to all of my target environments (dev, staging, prod). However, I have one specific workflow that is supposed to be deployed only to the dev target environment.
How can I implement that this specific workflow is only deployed to dev target?
I was searching for a way to do that on the workflow level (Create a new job | Jobs API | REST API reference | Azure Databricks). However, according to that reference, this does not seem to be possible.
Any other location where I can specify this?
09-16-2024 04:56 AM - edited 09-16-2024 04:58 AM
Hi @johnb1 ,
To deploy to a specific target, you can deploy with the -t flag:
databricks bundle deploy -t dev
Check the docs:
bundle command group | Databricks on AWS
09-16-2024 05:21 AM - edited 09-16-2024 05:23 AM
Hi @johnb1 ,
You can achieve what you want by using the targets mapping. For the dev target, define the job in the resources sub-mapping so that it exists only in that environment.
https://docs.databricks.com/en/dev-tools/bundles/settings.html
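A minimal sketch of that approach, assuming a hypothetical job named dev_only_job and a placeholder notebook path (adjust both to your bundle):

```yaml
# databricks.yml -- sketch only; job name and notebook path are assumptions
bundle:
  name: my_bundle

targets:
  dev:
    resources:
      jobs:
        dev_only_job:
          name: dev-only-job
          tasks:
            - task_key: main
              notebook_task:
                notebook_path: ./notebooks/dev_only
  prod:
    workspace:
      host: https://example.azuredatabricks.net
```

Because dev_only_job is defined only under the dev target, deploying with -t prod (or -t staging) will not create it in those environments.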
09-16-2024 05:51 AM
Thank you for that suggestion @szymon_dybczak. This looks helpful. My workflow is already defined in another .yaml file, as I have stored all workflow configs - each in its own file - in a central folder. The thing is, I do not want to define the workflow under targets in my DAB root file (usually called databricks.yml), as you suggested, but rather reference it from there. Is this possible?
09-16-2024 06:44 AM - edited 09-16-2024 06:58 AM
Hi @johnb1 ,
I'm not sure if you can reference a yaml file in the targets mapping, but I can check it tomorrow morning.
If I had to guess, based on the documentation entry below, I would say that there is currently no way to reference another yaml file in the targets mapping:
"The include array specifies a list of path globs that contain configuration files to include within the bundle. These path globs are relative to the location of the bundle configuration file in which the path globs are specified.
The Databricks CLI does not include any configuration files by default within the bundle. You must use the include array to specify any and all configuration files to include within the bundle, other than the databricks.yml file itself.
This include array can appear only as a top-level mapping."
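To illustrate what that documentation entry means, here is a sketch of how the include array is used at the top level of databricks.yml (the glob pattern and folder name are assumptions for illustration):

```yaml
# databricks.yml -- include works only here, at the top level,
# not inside a targets mapping
include:
  - resources/*.yml
```

Since include can appear only as a top-level mapping, every file it pulls in is part of the bundle for all targets; there is no per-target include.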
So, the easiest way to achieve what you want is to do something similar. But tomorrow I will confirm whether we can reference a yaml file.
09-16-2024 06:57 AM
Thanks @szymon_dybczak that would be great 👌
09-20-2024 01:41 AM
Did you have a chance to check if such referencing of a workflow from another yaml file is possible?
Given you did not reply, I assume it is not possible. This would also confirm what I found during my tests.
If you know otherwise, however, please let me know 😀
09-20-2024 02:00 AM
Hi @johnb1 ,
Sorry, I checked it but forgot to post a reply. I can confirm that it is currently not possible. I'll set up a request for this feature on the Databricks Ideas Portal.
09-24-2024 01:43 AM
Thanks for getting back and clarifying @szymon_dybczak