DLT pipelines in the same job sharing compute

bradleyjamrozik
New Contributor III

[Screenshot: a Databricks Workflows job orchestrating multiple DLT pipeline tasks]

If I have a job like this that orchestrates N DLT pipelines, what setting do I need to adjust so that they use the same compute resources between steps rather than spinning up and shutting down for each individual pipeline?

 

4 REPLIES

bradleyjamrozik
New Contributor III

Edit: I'm using Databricks Asset Bundles to deploy both the pipelines and the jobs.
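For reference, the bundle config looks roughly like this (a trimmed sketch; resource and task names are illustrative, not the real ones):

```yaml
# databricks.yml (excerpt) - a job that runs two DLT pipelines in sequence
resources:
  jobs:
    dlt_orchestration_job:
      name: dlt_orchestration_job
      tasks:
        - task_key: run_pipeline_one
          pipeline_task:
            pipeline_id: ${resources.pipelines.pipeline_one.id}
        - task_key: run_pipeline_two
          depends_on:
            - task_key: run_pipeline_one
          pipeline_task:
            pipeline_id: ${resources.pipelines.pipeline_two.id}
```

Each pipeline_task starts that pipeline's own cluster, which is the spin-up/shut-down overhead between steps that I'm trying to avoid.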

shan_chandra
Esteemed Contributor

@bradleyjamrozik - Under the DLT pipeline settings, multiple notebooks can be listed together in a single pipeline. That pipeline then deploys a single compute resource that runs all of the notebooks.
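In bundle terms, that would be one pipeline resource listing every notebook, along these lines (a rough sketch; names and paths are placeholders):

```yaml
# databricks.yml (excerpt) - one DLT pipeline running all notebooks on one cluster
resources:
  pipelines:
    combined_pipeline:
      name: combined_pipeline
      catalog: main
      target: my_schema
      libraries:
        - notebook:
            path: ../src/notebook_one.py
        - notebook:
            path: ../src/notebook_two.py
        - notebook:
            path: ../src/notebook_three.py
```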

bradleyjamrozik
New Contributor III

The different pipelines publish to different catalogs/schemas, so they have to be kept as separate pipelines.
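For example, the pipeline definitions differ only in where they publish (values here are illustrative):

```yaml
resources:
  pipelines:
    pipeline_one:
      name: pipeline_one
      catalog: catalog_a   # each DLT pipeline publishes to a single catalog/target
      target: schema_a
      libraries:
        - notebook:
            path: ../src/pipeline_one.py
    pipeline_two:
      name: pipeline_two
      catalog: catalog_b
      target: schema_b
      libraries:
        - notebook:
            path: ../src/pipeline_two.py
```

so collapsing them into one pipeline (and one target) isn't an option here.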

shan_chandra
Esteemed Contributor

@bradleyjamrozik - I checked internally as well. DLT pipelines cannot share compute resources; each pipeline provisions its own cluster.