Data Engineering

DLT pipelines in the same job sharing compute

bradleyjamrozik
New Contributor III

[Screenshot: a job orchestrating several DLT pipeline tasks chained together]

If I have a job like this that orchestrates N DLT pipelines, what setting do I need to adjust so that the steps share compute resources, rather than spinning up and shutting down a cluster for each individual pipeline?
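For reference, the job is shaped roughly like this sketch using the Databricks Python SDK (the pipeline IDs are placeholders; in practice I define everything in a bundle):

```python
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import jobs

w = WorkspaceClient()

# Placeholder pipeline IDs; each task below triggers one existing DLT pipeline.
pipeline_ids = ["<pipeline-id-1>", "<pipeline-id-2>", "<pipeline-id-3>"]

w.jobs.create(
    name="orchestrate-dlt-pipelines",
    tasks=[
        jobs.Task(
            task_key=f"pipeline_{i}",
            pipeline_task=jobs.PipelineTask(pipeline_id=pid),
            # Chain the tasks so the pipelines run one after another.
            depends_on=(
                [jobs.TaskDependency(task_key=f"pipeline_{i - 1}")] if i > 0 else None
            ),
        )
        for i, pid in enumerate(pipeline_ids)
    ],
)
```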

 

4 REPLIES

bradleyjamrozik
New Contributor III

Edit: I'm using Databricks Asset Bundles to deploy both the pipelines and the jobs.

shan_chandra
Honored Contributor III

@bradleyjamrozik - under the DLT pipeline settings, all of the notebooks can be listed together in a single pipeline. That pipeline will then deploy a single compute resource for all of its tasks.
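A rough sketch of that setup with the Databricks Python SDK (the notebook paths, catalog, and target schema here are placeholders):

```python
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.pipelines import NotebookLibrary, PipelineLibrary

w = WorkspaceClient()

# One pipeline whose settings list every notebook: DLT then deploys a
# single compute resource that runs all of them, instead of one cluster
# per pipeline.
w.pipelines.create(
    name="combined-dlt-pipeline",
    libraries=[
        PipelineLibrary(notebook=NotebookLibrary(path="/Repos/etl/bronze")),
        PipelineLibrary(notebook=NotebookLibrary(path="/Repos/etl/silver")),
        PipelineLibrary(notebook=NotebookLibrary(path="/Repos/etl/gold")),
    ],
    catalog="main",      # note: a single pipeline publishes to one catalog
    target="analytics",  # ...and one target schema
)
```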

bradleyjamrozik
New Contributor III

The different pipelines publish to different catalogs/schemas, so they have to be kept as separate pipelines.

shan_chandra
Honored Contributor III

@bradleyjamrozik - I checked internally as well: DLT pipelines cannot share compute resources, so each pipeline in the job will continue to spin up its own cluster.
