Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

How do I move existing workflows and jobs running on an all-purpose cluster to a shared jobs cluster?

MadelynM
New Contributor III

A Databricks cluster is a set of computation resources that performs the heavy lifting of all of the data workloads you run in Databricks. Databricks provides a number of options when you create and configure clusters to help you get the best performance at the lowest cost.

This post walks you through switching jobs that run on an all-purpose cluster to a shared jobs cluster, which reduces resource usage and cost.

Prerequisites

  • You have permissions to create and manage clusters in your Databricks workspace - Cluster overview AWS, Azure, GCP
  • You have Jobs running – Jobs Quickstart AWS, Azure, GCP

Steps to move existing jobs and workflows

  1. Navigate to the Data Science & Engineering homepage
  2. Click Workflows in the left navigation bar
  3. Click a Job Name and find the Compute in the left panel
  4. Click the Swap button
  5. Select an existing Jobs Cluster (if available) or click `New job cluster` to create a new one
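Before swapping clusters in the UI, it can help to see which tasks in a job still pin an all-purpose cluster. A minimal sketch of that check, assuming field names as returned by the Jobs 2.1 API (the sample settings dict below is hypothetical):

```python
def tasks_on_all_purpose_clusters(job_settings: dict) -> list[str]:
    """Return the task_keys that reference an all-purpose cluster directly.

    Tasks pinned to an all-purpose cluster carry an `existing_cluster_id`;
    tasks already on a shared job cluster carry a `job_cluster_key` instead.
    """
    return [
        task["task_key"]
        for task in job_settings.get("tasks", [])
        if "existing_cluster_id" in task
    ]

# Hypothetical job settings, shaped like a Jobs 2.1 "get" response.
settings = {
    "tasks": [
        {"task_key": "ingest", "existing_cluster_id": "0123-456789-abcdefgh"},
        {"task_key": "transform", "job_cluster_key": "shared_job_cluster"},
    ]
}

print(tasks_on_all_purpose_clusters(settings))  # ['ingest']
```

Any task this returns is a candidate for the Swap button in step 4.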

You can also use the Jobs API: the `job_clusters` object in an update request lets you partially update the list of job cluster specifications that can be shared and reused by the tasks of that job.
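As a sketch of that API route, here is what an update payload might look like that defines a shared job cluster and points a task at it. The workspace URL, token, job ID, task key, and cluster settings are all hypothetical placeholders; the request itself is left commented out so the sketch runs without a live workspace:

```python
import json
import urllib.request

# Hypothetical values -- replace with your workspace URL, token, and job ID.
WORKSPACE_URL = "https://example.cloud.databricks.com"
TOKEN = "dapi-example-token"
JOB_ID = 123

# Partial update: define a shared job cluster, then reference it from each
# task via job_cluster_key instead of an all-purpose existing_cluster_id.
payload = {
    "job_id": JOB_ID,
    "new_settings": {
        "job_clusters": [
            {
                "job_cluster_key": "shared_job_cluster",
                "new_cluster": {
                    "spark_version": "11.3.x-scala2.12",
                    "node_type_id": "i3.xlarge",
                    "num_workers": 2,
                },
            }
        ],
        "tasks": [
            {"task_key": "main_task", "job_cluster_key": "shared_job_cluster"}
        ],
    },
}

request = urllib.request.Request(
    f"{WORKSPACE_URL}/api/2.1/jobs/update",
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "Content-Type": "application/json",
    },
    method="POST",
)
# urllib.request.urlopen(request)  # would submit the update to the workspace

print(json.dumps(payload["new_settings"]["job_clusters"], indent=2))
```

Tasks that reference `job_cluster_key` share the cluster defined under `job_clusters`, which is what makes the cluster reusable across the job's tasks.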


Drop your questions, feedback and tips below! 👇

3 REPLIES

Prabakar
Esteemed Contributor III

You can refer here for additional information.

Anonymous
Not applicable

@Doug Harrigan Thanks for your question! @Prabakar Ammeappin linked above to our Docs page that mentions a bit more about the recent (April) version update:

"This release fixes an issue that removed the Swap cluster button from the Databricks jobs user interface when the assigned cluster is unavailable. You can now assign a new cluster to a job in the UI when the configured cluster is unavailable, for example, because of a network change."

I am not sure that entirely answers your "methodology" question above, but let us know! Hope to hear back soon.

Kaniz_Fatma
Community Manager

Hi @Madelyn Mullen, thank you for sharing such an excellent and informative post. We hope to see these often.
