07-05-2022 10:32 AM
A Databricks cluster is a set of computation resources that performs the heavy lifting of all of the data workloads you run in Databricks. Databricks provides a number of options when you create and configure clusters to help you get the best performance at the lowest cost.
This post will help you move jobs that currently run on an all-purpose cluster onto a shared job cluster. Job clusters spin up for the job run and terminate when it completes, and a single job cluster can be shared by multiple tasks in the same job, which reduces resource usage and cost.
Prerequisites
Steps to move existing jobs and workflows
You can also use the Jobs API. The job_clusters object defines a list of job cluster specifications that can be shared and reused by the tasks of a job, and it can be partially updated without resubmitting the whole job definition.
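As a rough sketch of what that looks like, the snippet below builds a partial-update payload for the Jobs API 2.1 `jobs/update` endpoint that defines one shared job cluster and points tasks at it via `job_cluster_key`. The workspace URL, token, job ID, task keys, runtime version, and node type are all placeholders for illustration, not values from this post.

```python
# Sketch: moving a job onto a shared job cluster via the Jobs API 2.1
# "update" endpoint. All identifiers below are illustrative placeholders.
import json
import urllib.request

WORKSPACE_URL = "https://<your-workspace>.cloud.databricks.com"  # placeholder
TOKEN = "<personal-access-token>"  # placeholder


def build_update_payload(job_id):
    """Build a partial-update payload: one shared job cluster spec in
    job_clusters, with each task referencing it by job_cluster_key."""
    return {
        "job_id": job_id,
        "new_settings": {
            "job_clusters": [
                {
                    "job_cluster_key": "shared_job_cluster",
                    "new_cluster": {
                        "spark_version": "10.4.x-scala2.12",  # example runtime
                        "node_type_id": "i3.xlarge",          # example node type
                        "num_workers": 2,
                    },
                }
            ],
            "tasks": [
                # Hypothetical task keys; real jobs would list their own tasks.
                {"task_key": "ingest", "job_cluster_key": "shared_job_cluster"},
                {"task_key": "transform", "job_cluster_key": "shared_job_cluster"},
            ],
        },
    }


def update_job(job_id):
    """POST the payload to the Jobs API 2.1 update endpoint."""
    req = urllib.request.Request(
        f"{WORKSPACE_URL}/api/2.1/jobs/update",
        data=json.dumps(build_update_payload(job_id)).encode(),
        headers={
            "Authorization": f"Bearer {TOKEN}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Because `jobs/update` is a partial update, fields you omit from `new_settings` are left unchanged, so a payload like this can swap the cluster assignment without touching schedules or notifications.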
Learn more
Drop your questions, feedback and tips below! 👇
07-06-2022 04:52 AM
You can refer here for additional information.
07-06-2022 04:06 PM
@Doug Harrigan Thanks for your question! @Prabakar Ammeappin linked above to our Docs page that mentions a bit more about the recent (April) version update/change:
"This release fixes an issue that removed the Swap cluster button from the Databricks jobs user interface when the assigned cluster is unavailable. You can now assign a new cluster to a job in the UI when the configured cluster is unavailable, for example, because of a network change."
I am not sure that entirely answers your "methodology" question above, but let us know! Hope to hear back soon.