Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Deploy Workflow only to specific target (Databricks Asset Bundles)

johnb1
Contributor

I am using Databricks Asset Bundles to deploy Databricks workflows to all of my target environments (dev, staging, prod). However, I have one specific workflow that is supposed to be deployed only to the dev target environment.

How can I ensure that this specific workflow is deployed only to the dev target?

I was searching for a way to do that at the workflow level (Create a new job | Jobs API | REST API reference | Azure Databricks). However, according to that reference, this does not seem to be possible.

Any other location where I can specify this?

5 REPLIES

filipniziol
New Contributor III

Hi @johnb1 ,
To deploy to a specific target, use the -t flag:

databricks bundle deploy -t dev

Check the docs:
bundle command group | Databricks on AWS

szymon_dybczak
Contributor

Hi @johnb1 ,

You can achieve what you want by using the targets mapping. For the dev target, define the job in its resources sub-mapping so that it exists only in that environment.

https://docs.databricks.com/en/dev-tools/bundles/settings.html

[screenshot: szymon_dybczak_0-1726489293705.png]
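To make the suggestion concrete, here is a minimal sketch of a databricks.yml where a job is defined only under the dev target (the bundle name, job key, and notebook path are hypothetical; the screenshot above presumably showed something similar):

```yaml
# databricks.yml -- sketch; names and paths are placeholders
bundle:
  name: my_bundle

targets:
  dev:
    # This job exists only in the dev target.
    resources:
      jobs:
        dev_only_job:
          name: dev-only-job
          tasks:
            - task_key: main
              notebook_task:
                notebook_path: ./notebooks/dev_only

  staging: {}
  prod: {}
```

With this layout, `databricks bundle deploy -t dev` deploys dev_only_job, while deploying to staging or prod does not.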

 

Thank you for that suggestion @szymon_dybczak. This looks helpful. My workflow is already defined in another .yaml file, as I store all workflow configs, one per file, in a central folder. The thing is, I do not want to define the workflow under targets in my DAB root file (usually called databricks.yml), as you suggested, but rather reference it there. Is this possible?

 

Hi @johnb1 ,

I'm not sure if you can reference a yaml file in the targets mapping, but I can check it tomorrow morning.
If I had to guess, based on the documentation entry below, I would say that there is currently no way to reference another yaml file in the targets mapping:

"The include array specifies a list of path globs that contain configuration files to include within the bundle. These path globs are relative to the location of the bundle configuration file in which the path globs are specified.

The Databricks CLI does not include any configuration files by default within the bundle. You must use the include array to specify any and all configuration files to include within the bundle, other than the databricks.yml file itself.

This include array can appear only as a top-level mapping."
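The quoted behavior can be illustrated with a small sketch: the include array sits at the top level of databricks.yml and pulls in other configuration files via path globs (the folder and file names here are hypothetical):

```yaml
# databricks.yml -- sketch; folder layout is a placeholder
bundle:
  name: my_bundle

# Top-level only: globs are relative to this file's location.
# Files matched here are merged into the bundle configuration,
# but they cannot be referenced from inside a targets mapping.
include:
  - workflows/*.yml

targets:
  dev:
    default: true
  prod: {}
```

Note that any workflow defined in an included file is part of the bundle for all targets; per-target scoping still has to happen via the targets mapping itself.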

So, the easiest way to achieve what you want is to do something similar. But tomorrow I will confirm whether we can reference a yaml file.

[screenshot: szymon_dybczak_0-1726495087989.png]

 

Thanks @szymon_dybczak, that would be great 👌
