02-22-2024 11:31 PM
Hi team,
When I create a DLT job, is there a way to control the cluster runtime version, e.g. to use 14.3 LTS? I tried adding `"spark_version": "14.3.x-scala2.12",` under the `default` cluster label, but it didn't work.
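Roughly what I put in the pipeline settings JSON (the pipeline name here is just a placeholder):

```json
{
  "name": "my_pipeline",
  "clusters": [
    {
      "label": "default",
      "spark_version": "14.3.x-scala2.12"
    }
  ]
}
```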
Thanks
02-22-2024 11:54 PM
02-23-2024 10:26 AM
Thanks. I mean running the DLT pipeline itself, not running a cell from the notebook that the DLT pipeline sources.
02-26-2024 05:48 PM
Hi Brad,
I don't think you can. A recent Databricks article (which apparently I can't link here) says in the "DLT Compute" section: "DLT will manage and optimize the node type and DBR selection, ensuring the best choice of nodes and the latest DBR runtime are selected, reducing the management overhead for users."
Thanks.
02-26-2024 07:17 PM
Hello @Brad,
This page lists the cluster attributes that you cannot change and why, including `spark_version`.
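As a side note, the knob you can set is the pipeline-level `channel` field (`CURRENT` or `PREVIEW`), which picks the runtime track DLT uses rather than a specific DBR version. A minimal sketch, where the pipeline name and autoscale values are just placeholders:

```json
{
  "name": "my_pipeline",
  "channel": "PREVIEW",
  "clusters": [
    {
      "label": "default",
      "autoscale": {
        "min_workers": 1,
        "max_workers": 4
      }
    }
  ]
}
```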
02-27-2024 03:20 PM
Thanks. Got it.
Also, the cluster has to be in shared access mode. Can different DLT jobs share a cluster, or can other people use the cluster while a DLT job is running? It seems each DLT job run starts a new cluster. If the cluster can't be shared, why does it have to be in shared mode?