
Failed to launch pipeline cluster

SimonXu
New Contributor II

Hi there. I encountered an issue when I was trying to create my Delta Live Tables pipeline.

The error is "DataPlaneException: Failed to launch pipeline cluster 1202-031220-urn0toj0: Could not launch cluster due to cloud provider failures. azure_error_code: OperationNotAllowed, azure_error_message: Operation could not be completed as it results in exceeding approved standardFSF...".

However, when I checked my Azure subscription, it showed that I had more than enough quota. I don't know how to fix this issue, as I'm new to Delta Live Tables.

1 ACCEPTED SOLUTION

Anonymous
Not applicable

Hi @Simon Xu​,

You're not alone.

I have encountered this issue before. It comes from the Azure side, not Databricks.

You need to check the number of cores, the RAM, and the CPU type of your cluster, then compare them against the resources available in the Azure resource group hosting the Databricks workspace. If you don't have enough, you need to request a quota increase from Azure.
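The comparison above boils down to simple arithmetic: the driver plus every worker must fit in the remaining vCPU quota for that specific VM family (a per-family limit, separate from the subscription-wide total, which is why the overall quota can look fine while a launch still fails). A minimal sketch, with all numbers hypothetical:

```python
def fits_quota(limit: int, current_usage: int, cores_per_node: int, num_nodes: int) -> bool:
    """Return True if launching `num_nodes` nodes with `cores_per_node` vCPUs
    each stays within the regional vCPU quota for that VM family."""
    requested = cores_per_node * num_nodes
    return current_usage + requested <= limit

# Standard_F8s has 8 vCPUs; a driver plus one worker needs 16.
# With a family limit of only 10 vCPUs and nothing in use, the launch is rejected:
print(fits_quota(limit=10, current_usage=0, cores_per_node=8, num_nodes=2))  # False
```

The actual limit and usage figures come from the "Usage + quotas" page of your Azure subscription, filtered to the region hosting the workspace.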

Feel free to ask me if you have any questions 😀

BR,

Jensen Nguyen


6 REPLIES



SimonXu
New Contributor II

Thanks, Jensen. It works for me!

Adedotun
New Contributor II

Unfortunately, I just encountered this error too and followed your solution, but it's still not working. The Usage + quotas page on Azure shows 4 cores used out of 10 (6 remaining), and the required Databricks compute is 4 cores. However, in my case I used a single node. I strongly suspect I have to switch to a multi-node cluster and then request an increase in cores from Azure. I'll be back with an update!

arpit
Contributor II

@Simon Xu​ 

I suspect that DLT is trying to grab machine types that you simply have zero quota for in your Azure account. By default, the following machine types get requested behind the scenes for DLT:

AWS: c5.2xlarge

Azure: Standard_F8s

GCP: e2-standard-8

You can also set the machine types explicitly in the pipeline's cluster settings.
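Setting the node type explicitly is done in the pipeline's JSON settings. A minimal sketch, assuming an Azure workspace; `Standard_DS3_v2` and the worker counts are only placeholders, so pick a VM family you actually have quota for:

```json
{
  "name": "my_pipeline",
  "clusters": [
    {
      "label": "default",
      "node_type_id": "Standard_DS3_v2",
      "autoscale": {
        "min_workers": 1,
        "max_workers": 2
      }
    }
  ]
}
```

With `node_type_id` pinned like this, DLT no longer falls back to its default machine type (Standard_F8s on Azure), which is what triggers the quota error when that family's limit is low.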

Regards,

Arpit Khare

Masoomeh
New Contributor II

I followed @arpit's suggestion and set the cluster configuration explicitly in the pipeline's JSON settings, which solved the issue.
