Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

by JoeWMP, New Contributor III
  • 1060 Views
  • 1 reply
  • 7 kudos

All-purpose compute clusters that are attached to a pool are no longer able to switch to a different pool/change to a non-pool worker/driver.

Would like to know if anyone else is experiencing this - we're seeing this across 5+ different Databricks workspaces in both AWS and Azure. Reproduction: create an all-purpose compute cluster, attach it to an existing pool, save and start the cluster. Edit clus...
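
For context, a minimal sketch of the operation being attempted, using the Databricks Clusters REST API (clusters/edit). The workspace URL, token, and cluster/pool IDs below are placeholders, not values from this thread; the report suggests such an edit no longer takes effect, whether made through the UI or otherwise.

import requests

HOST = "https://<workspace>.cloud.databricks.com"  # placeholder workspace URL
TOKEN = "<personal-access-token>"                   # placeholder PAT
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

# Ask an existing all-purpose cluster to take its workers from a different pool.
resp = requests.post(
    f"{HOST}/api/2.0/clusters/edit",
    headers=HEADERS,
    json={
        "cluster_id": "<cluster-id>",          # cluster currently attached to pool A
        "spark_version": "10.4.x-scala2.12",
        "instance_pool_id": "<new-pool-id>",   # pool B we want to switch to
        "num_workers": 2,
    },
)
resp.raise_for_status()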

Latest Reply
JoeWMP, New Contributor III
  • 7 kudos

We're also seeing the same behavior when trying to change the pool on an all-purpose cluster using Terraform and the Databricks Labs Terraform provider. The Terraform apply will go through and say the cluster was updated to the new pool id, but t...
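
A quick way to check whether the apply actually changed anything is to read the cluster back from the workspace and compare the pool fields against what Terraform reports. This is a sketch with placeholder host, token, and cluster ID:

import requests

HOST = "https://<workspace>.cloud.databricks.com"  # placeholder workspace URL
TOKEN = "<personal-access-token>"                   # placeholder PAT

# Fetch the cluster spec the workspace actually has on record.
info = requests.get(
    f"{HOST}/api/2.0/clusters/get",
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"cluster_id": "<cluster-id>"},
).json()

# If the symptom described above is present, these still show the old pool id
# even though the Terraform state claims the cluster was updated.
print("worker pool:", info.get("instance_pool_id"))
print("driver pool:", info.get("driver_instance_pool_id"))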

by mmlime, New Contributor III
  • 1875 Views
  • 4 replies
  • 0 kudos

Resolved! Can I use VMs from Pool for my Workflow cluster?

Hi, is there no option to take VMs from a Pool for a new workflow (Azure Cloud)? Default schema for a new cluster: { "num_workers": 0, "spark_version": "10.4.x-scala2.12", "spark_conf": { "spark.master": "local[*, 4]", "spark...

Latest Reply
Vivian_Wilfred, Honored Contributor
  • 0 kudos

@Michal Mlaka I just checked in the UI and I could find the pools listed under worker type in a job cluster configuration. It should work.

3 More Replies
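
To illustrate the reply above, here is a hedged sketch of a workflow (job) whose cluster draws workers from a pool: instead of a node_type_id, the new_cluster spec carries an instance_pool_id. The pool id, notebook path, and workspace details are placeholders, not values from this thread.

import requests

HOST = "https://<workspace>.azuredatabricks.net"  # placeholder Azure workspace URL
TOKEN = "<personal-access-token>"                  # placeholder PAT

# Job cluster spec: same shape as the default schema quoted in the question,
# but with instance_pool_id selecting a pool instead of a node type.
new_cluster = {
    "spark_version": "10.4.x-scala2.12",
    "num_workers": 2,
    "instance_pool_id": "<pool-id>",
}

resp = requests.post(
    f"{HOST}/api/2.1/jobs/create",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "name": "pool-backed-workflow",
        "tasks": [{
            "task_key": "main",
            "new_cluster": new_cluster,
            "notebook_task": {"notebook_path": "/path/to/notebook"},
        }],
    },
)
resp.raise_for_status()
print(resp.json())

As noted in the reply, the same pools also appear in the worker type dropdown when configuring a job cluster in the UI.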