10-26-2023 11:00 AM
If I have a job like this that orchestrates N DLT pipelines, what setting do I need to adjust so that they use the same compute resources between steps rather than spinning up and shutting down for each individual pipeline?
11-07-2023 08:42 AM
Edit: I'm using databricks asset bundles to deploy both the pipelines and jobs.
11-07-2023 11:08 AM
@bradleyjamrozik - In the DLT pipeline settings, the notebooks can all be listed together in a single pipeline. That deploys one compute resource for all of the tasks.
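A minimal sketch of what that consolidated pipeline could look like in a bundle's `databricks.yml` — the pipeline name, notebook paths, and catalog/schema here are placeholders, not values from your project:

```yaml
# databricks.yml (fragment) — hypothetical names and paths for illustration
resources:
  pipelines:
    consolidated_dlt:
      name: consolidated-dlt
      catalog: main        # single target catalog
      target: bronze       # single target schema
      libraries:
        # All notebooks listed under one pipeline run on one compute resource
        - notebook:
            path: ./notebooks/ingest_orders.py
        - notebook:
            path: ./notebooks/ingest_customers.py
        - notebook:
            path: ./notebooks/ingest_payments.py
```

Note this only works when the notebooks can share a single target catalog/schema, since those are set at the pipeline level.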
11-07-2023 11:10 AM
The different pipelines point to different catalogs/schemas, so they have to be separated out.
11-13-2023 10:17 AM
@bradleyjamrozik - I checked internally too. DLT pipelines cannot share compute resources; each pipeline provisions its own cluster.