Multiple DLT Pipelines Sharing a Single Cluster

bozhu
Contributor

When you use Workflows to orchestrate standard notebook tasks, they can share a single job cluster. It would be great if we could achieve the same for DLT pipelines orchestrated in a Workflows job.

I understand that DLT pipelines use their own special clusters, so a good solution would be the ability to define a DLT-type cluster in a Workflows job and share it across different DLT pipeline tasks.

The biggest benefit would be a significant saving on the cluster start-up time currently wasted by each individual DLT pipeline.
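For context, the notebook-task sharing described above is declared once under `job_clusters` and referenced by key from each task. A minimal sketch of such a job definition (cluster sizing and notebook paths are illustrative, not from this thread):

```json
{
  "name": "example-shared-cluster-job",
  "job_clusters": [
    {
      "job_cluster_key": "shared_cluster",
      "new_cluster": {
        "spark_version": "11.3.x-scala2.12",
        "node_type_id": "i3.xlarge",
        "num_workers": 2
      }
    }
  ],
  "tasks": [
    {
      "task_key": "notebook_task_1",
      "job_cluster_key": "shared_cluster",
      "notebook_task": { "notebook_path": "/Repos/project/notebook_1" }
    },
    {
      "task_key": "notebook_task_2",
      "job_cluster_key": "shared_cluster",
      "notebook_task": { "notebook_path": "/Repos/project/notebook_2" }
    }
  ]
}
```

Both tasks reference the same `job_cluster_key`, so they run on one cluster. DLT pipeline tasks do not accept a `job_cluster_key`, which is the gap this request is about.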

2 REPLIES

Hubert-Dudek
Esteemed Contributor III

Sharing a cluster across DLT pipelines is not currently possible, but in the DLT pipeline settings you can add multiple notebooks or files to be processed by a single pipeline.
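The workaround above can be sketched in the pipeline settings JSON, where the `libraries` array accepts multiple notebook entries that all run on the pipeline's cluster (paths are illustrative):

```json
{
  "name": "example-dlt-pipeline",
  "libraries": [
    { "notebook": { "path": "/Repos/project/dlt_source_a" } },
    { "notebook": { "path": "/Repos/project/dlt_source_b" } }
  ],
  "continuous": false
}
```

Consolidating notebooks this way avoids paying the cluster start-up cost once per pipeline, at the price of coupling the sources into one pipeline's lifecycle.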

I hope that the serverless version will bring potential savings in the future.

Anonymous
Not applicable

Hi @Bo Zhu

Does @Hubert Dudek's response answer your question? If so, would you mark it as the best answer so that other members can find the solution more quickly?

We'd love to hear from you.

Thanks!
