DLT Compute: "Ephemeral" Job Compute vs. All-purpose compute 2.0 ... WHY?

ChristianRRL
Contributor III

Hi there, this is a follow-up from a discussion I started last month

Solved: Re: DLT Compute: "Ephemeral" Job Compute vs. All-p... - Databricks Community - 71661

Based on what was discussed, I understand that it's not possible to use "All-Purpose Clusters" with DLT pipelines. I would like to understand WHY this is the case. I don't follow why Databricks wouldn't at least allow this as a possible implementation, since the "ephemeral" job compute clusters effectively always cost us more: they require spinning up new resources when we already have All-Purpose Clusters up and running.

Is there something I'm missing here?
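
For context, a rough sketch of the contrast I'm describing (Python against the REST APIs; the workspace URL, token, cluster ID, and notebook paths are placeholders, not our real setup):

```python
import requests

HOST = "https://<workspace>.cloud.databricks.com"              # placeholder workspace URL
HEADERS = {"Authorization": "Bearer <personal-access-token>"}  # placeholder token

# A regular job task CAN reuse an already-running all-purpose cluster:
job_spec = {
    "name": "example-batch-job",
    "tasks": [
        {
            "task_key": "ingest",
            "notebook_task": {"notebook_path": "/Repos/demo/ingest"},
            "existing_cluster_id": "0101-123456-abcdefgh",  # our all-purpose cluster
        }
    ],
}
requests.post(f"{HOST}/api/2.1/jobs/create", headers=HEADERS, json=job_spec)

# A DLT pipeline spec, by contrast, only accepts new-cluster definitions under
# "clusters" (there is no existing_cluster_id field), so every update spins up
# its own "ephemeral" job compute:
pipeline_spec = {
    "name": "example-dlt-pipeline",
    "libraries": [{"notebook": {"path": "/Repos/demo/dlt_pipeline"}}],
    "clusters": [{"label": "default", "num_workers": 2}],
}
requests.post(f"{HOST}/api/2.0/pipelines", headers=HEADERS, json=pipeline_spec)
```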


3 REPLIES

ChristianRRL
Contributor III

Good morning @Retired_mod, I think most of these points make sense, particularly running pipelines in a "fully isolated environment". I can understand that this may be the best practice (or, in this case, the only practice) allowed by Databricks, but I'm still somewhat confused as to why there isn't at least an option to leverage all-purpose clusters with DLT jobs (even if just as a non-default option). Out of curiosity, do you know if there's been any discussion within Databricks about making this possible in the future?

Additionally, regarding point (5) about data analytics (all-purpose) clusters being subject to "different pricing" than data engineering (job/task) workloads, how might I best compare and contrast pricing between the two? For example, at the moment DLT is effectively *only* adding cost for us: our existing setup treats the all-purpose clusters as "set in stone", so any additional compute, such as job clusters, is extra spend because it doesn't run on the all-purpose clusters we're already paying for. If we had a better idea of the cost savings we might get with DLT job clusters compared with all-purpose clusters, we could shift some compute load off all-purpose and more concretely save on costs rather than just adding to them.
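
To make that concrete, a back-of-the-envelope comparison along these lines is what I'm after; the per-DBU rates below are placeholders I'd replace with the actual values from the pricing calculator for our cloud, region, and tier:

```python
# Rough per-run cost: cost = runtime_hours * DBUs_per_hour * $_per_DBU.
# The rates are illustrative placeholders, NOT quoted Databricks prices.
RATE_ALL_PURPOSE = 0.55  # $/DBU (placeholder)
RATE_DLT_JOB = 0.30      # $/DBU (placeholder)

def run_cost(runtime_hours: float, dbus_per_hour: float, rate_per_dbu: float) -> float:
    """Dollar cost of one run on a given compute type."""
    return runtime_hours * dbus_per_hour * rate_per_dbu

# Same hypothetical workload (same instance types and worker count) on both:
hours, dbus_per_hour = 1.5, 12.0
print(f"All-purpose: ${run_cost(hours, dbus_per_hour, RATE_ALL_PURPOSE):.2f} per run")
print(f"DLT job compute: ${run_cost(hours, dbus_per_hour, RATE_DLT_JOB):.2f} per run")
```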

@Retired_mod / @raphaelblg quick follow-up on this one. Wondering if anyone can provide a bit more feedback on the last points I wrote.

raphaelblg
Databricks Employee

@ChristianRRL regarding why DLT doesn't allow you to use all-purpose clusters:

1. The DLT runtime is derived from the shared compute DBR; it is not the same runtime and has different features from the common all-purpose runtime. A DLT pipeline cannot be executed on any of the all-purpose cluster runtimes.

2. DLT is a different product from all-purpose compute, with different pricing.

Feel free to use our Pricing Calculator to compare prices. At the moment, if you run the exact same workload on DLT, with the same driver and worker instance types (and the same number of workers), it should bill fewer DBUs than on all-purpose.
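
If you want to check this against real usage rather than list prices, one option (assuming the system billing tables are enabled in your workspace; treat the query as a sketch) is to compare DBU consumption by product directly:

```python
# Sum DBUs by product/SKU over the last 30 days using the system.billing.usage
# system table (run from a Databricks notebook, where `spark` is available).
usage = spark.sql("""
    SELECT billing_origin_product,
           sku_name,
           SUM(usage_quantity) AS dbus
    FROM system.billing.usage
    WHERE usage_date >= date_sub(current_date(), 30)
    GROUP BY billing_origin_product, sku_name
    ORDER BY dbus DESC
""")
usage.show(truncate=False)
```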

 

Best regards,

Raphael Balogo
Sr. Technical Solutions Engineer
Databricks
