
Deploy Workflow only to specific target (Databricks Asset Bundles)

johnb1
Contributor

I am using Databricks Asset Bundles to deploy Databricks workflows to all of my target environments (dev, staging, prod). However, I have one specific workflow that is supposed to be deployed only to the dev target environment.

How can I ensure that this specific workflow is deployed only to the dev target?

I was searching for a way to do that at the workflow level (Create a new job | Jobs API | REST API reference | Azure Databricks). However, according to the API reference, this does not seem to be possible.

Is there any other location where I can specify this?
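
For illustration, my bundle root looks roughly like this (bundle name and workspace hosts are simplified placeholders):

bundle:
  name: my_workflows

include:
  - workflows/*.yml   # central folder, one workflow config per file

targets:
  dev:
    workspace:
      host: https://adb-1111111111111111.1.azuredatabricks.net
  staging:
    workspace:
      host: https://adb-2222222222222222.2.azuredatabricks.net
  prod:
    workspace:
      host: https://adb-3333333333333333.3.azuredatabricks.net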


8 REPLIES

filipniziol
Contributor

Hi @johnb1 ,
To deploy to a specific target, you can deploy with the -t flag:

databricks bundle deploy -t dev

Check the docs:
bundle command group | Databricks on AWS

szymon_dybczak
Contributor III

Hi @johnb1 ,

You can achieve what you want by using the targets mapping. For the dev target, define the job that you want only in that environment inside its resources sub-mapping, as in the sketch below.

https://docs.databricks.com/en/dev-tools/bundles/settings.html

[screenshot: example targets mapping configuration]
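
A minimal sketch of that idea (job and notebook names are just placeholders):

targets:
  dev:
    workspace:
      host: https://adb-1111111111111111.1.azuredatabricks.net
    # Jobs declared here exist only in the dev deployment.
    resources:
      jobs:
        dev_only_job:
          name: dev_only_job
          tasks:
            - task_key: main
              notebook_task:
                notebook_path: ./notebooks/dev_only_notebook

Jobs defined at the top level of the bundle are deployed to every target, while jobs defined under a specific target like this are merged in only when you deploy to that target.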

 

Thank you for that suggestion, @szymon_dybczak. This looks helpful. My workflow is already defined in another .yaml file, as I have stored all workflow configs, each in its own file, in a central folder. The thing is, I do not want to define the workflow under targets in my DAB root file (usually called databricks.yml), as you suggested, but rather reference it. Is this possible?

 

Hi @johnb1 ,

I'm not sure if you can reference a yaml file in the targets mapping, but I can check it tomorrow morning.
But if I had to guess, based on the documentation entry below, I would say that there is currently no way to reference another yaml file in the targets mapping:

"The include array specifies a list of path globs that contain configuration files to include within the bundle. These path globs are relative to the location of the bundle configuration file in which the path globs are specified.

The Databricks CLI does not include any configuration files by default within the bundle. You must use the include array to specify any and all configuration files to include within the bundle, other than the databricks.yml file itself.

This include array can appear only as a top-level mapping."
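
So a top-level include, which applies to all targets, is the only supported form. A minimal sketch (the glob is just an example):

include:
  - workflows/*.yml   # merged into the bundle for every target; cannot be nested under targets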

So, the easiest way to achieve what you want is to do something similar to the configuration below. But tomorrow I will confirm whether we can reference a yaml file.

[screenshot: example of the dev-only job defined inline under the dev target]
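
Something like this (names and hosts are placeholders): keep the shared workflows in your central folder via include, and inline the dev-only job under the dev target:

bundle:
  name: my_workflows

include:
  - workflows/*.yml   # shared workflows, deployed to all targets

targets:
  dev:
    workspace:
      host: https://adb-1111111111111111.1.azuredatabricks.net
    resources:
      jobs:
        dev_only_job:   # inlined here, since include cannot be used per target
          name: dev_only_job
          tasks:
            - task_key: main
              notebook_task:
                notebook_path: ./notebooks/dev_only_notebook
  staging:
    workspace:
      host: https://adb-2222222222222222.2.azuredatabricks.net
  prod:
    workspace:
      host: https://adb-3333333333333333.3.azuredatabricks.net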

 

Thanks @szymon_dybczak, that would be great 👌

johnb1
Contributor

Hi @szymon_dybczak 

Did you have a chance to check if such referencing of a workflow from another yaml file is possible?

Since you did not reply, I assume it is not possible. That would also confirm what I found during my tests.

If you know otherwise, however, please let me know 😀

Hi @johnb1 ,

Sorry, I checked it but forgot to post a reply. I can confirm that currently it's not possible. I'll set up a request for this feature on the Databricks Ideas portal.

 

johnb1
Contributor

Thanks for getting back and clarifying, @szymon_dybczak
