Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

DLT pipelines in the same job sharing compute

bradleyjamrozik
New Contributor III

[Screenshot: a job orchestrating several DLT pipeline tasks in sequence]

If I have a job like this that orchestrates N DLT pipelines, what setting do I need to adjust so that they use the same compute resources between steps rather than spinning up and shutting down for each individual pipeline?

 

5 REPLIES

bradleyjamrozik
New Contributor III

Edit: I'm using databricks asset bundles to deploy both the pipelines and jobs.
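
For reference, a minimal sketch of what such a bundle job can look like, with one task per pipeline. All resource keys, names, and paths below are hypothetical placeholders, not the actual configuration:

```yaml
# databricks.yml fragment (hypothetical names throughout)
resources:
  jobs:
    orchestrator_job:
      name: orchestrator-job
      tasks:
        # Each pipeline_task triggers a separate DLT pipeline update,
        # which provisions its own cluster and shuts it down afterwards.
        - task_key: run_pipeline_a
          pipeline_task:
            pipeline_id: ${resources.pipelines.pipeline_a.id}
        - task_key: run_pipeline_b
          depends_on:
            - task_key: run_pipeline_a
          pipeline_task:
            pipeline_id: ${resources.pipelines.pipeline_b.id}
```

Each pipeline_task launches its own pipeline update on its own cluster, which is the spin-up-and-shut-down-per-step behavior in question.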

shan_chandra
Databricks Employee

@bradleyjamrozik - Under the DLT settings, the notebooks can all be listed together in a single pipeline. That pipeline will deploy a single compute resource for all of its tasks.
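
As a rough bundle-style sketch of that suggestion (paths and names are invented): a single pipeline can list several notebooks under libraries, and they all run on one shared pipeline cluster:

```yaml
# Hypothetical fragment: one pipeline, several notebooks,
# one compute resource shared by all of them.
resources:
  pipelines:
    combined_pipeline:
      name: combined-pipeline
      catalog: main          # one catalog for the whole pipeline
      target: analytics      # one target schema for the whole pipeline
      libraries:
        - notebook:
            path: ./notebooks/bronze.py
        - notebook:
            path: ./notebooks/silver.py
        - notebook:
            path: ./notebooks/gold.py
```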

The different pipelines point to different catalogs/schemas, so they have to be kept separate.
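
To illustrate why (hypothetical names again): a DLT pipeline's settings take a single catalog and a single target schema, so notebooks that write to different catalogs end up in separate pipelines:

```yaml
# Two pipelines kept separate because each pipeline allows
# exactly one catalog/target pair in its settings.
resources:
  pipelines:
    sales_pipeline:
      name: sales-pipeline
      catalog: sales_catalog       # pipeline-wide setting
      target: curated
      libraries:
        - notebook:
            path: ./notebooks/sales.py
    finance_pipeline:
      name: finance-pipeline
      catalog: finance_catalog     # different catalog, so a different pipeline
      target: curated
      libraries:
        - notebook:
            path: ./notebooks/finance.py
```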

@bradleyjamrozik - I checked internally too. DLT pipelines cannot share compute resources.

RasmusBrostroem
New Contributor II

@shan_chandra Hello, I have the same issue. I have a job that runs on serverless compute; it performs several tasks and then starts a DLT pipeline, which also uses serverless compute, so the job has to wait once more for a new resource to start.

Has anything changed since 2023, and if not, is this something that will be possible in the future? For my use case this is a real limitation: roughly 10 minutes of the overall job are spent waiting for resources to start, which isn't acceptable for a job that has to run every hour and deliver results quickly.