Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Failed to launch pipeline cluster

SimonXu
New Contributor II

Hi there. I encountered an issue when I was trying to create my Delta Live Tables pipeline.

The error is "DataPlaneException: Failed to launch pipeline cluster 1202-031220-urn0toj0: Could not launch cluster due to cloud provider failures. azure_error_code: OperationNotAllowed, azure_error_message: Operation could not be completed as it results in exceeding approved standardFSF...".

However, when I checked my Azure subscription, it showed that I had plenty of quota available. I don't know how to fix this issue, as I'm new to Delta Live Tables.

1 ACCEPTED SOLUTION

Accepted Solutions

Anonymous
Not applicable

Hi @Simon Xu​,

You're not alone.

I have encountered this issue before. It comes from the Azure side, not Databricks.

You need to check the number of cores, the RAM, and the CPU count required by your pipeline cluster, then compare them against the quotas of the Azure subscription and region hosting your Databricks workspace. If you don't have enough quota, request an increase to a higher limit.
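One way to do that comparison from the command line (a sketch, assuming the Azure CLI is installed and you are logged in; replace `eastus` with the region that actually hosts your workspace) is:

```shell
# List current vCPU usage vs. quota limits for the workspace's region.
az vm list-usage --location eastus --output table

# Narrow the output to the Standard FS family mentioned in the DLT error message.
az vm list-usage --location eastus --output table | grep -i "FS"
```

If the "CurrentValue" column is at or near "Limit" for the relevant VM family, that is the quota to increase.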

Feel free to ask me if you have any questions 😀

BR,

Jensen Nguyen


6 REPLIES

UmaMahesh1
Honored Contributor III

Hi @Simon Xu​ 

I hope this thread solves your issue:

https://learn.microsoft.com/en-us/answers/questions/195003/databricks-cluster-error-code-operationno...

Cheers

Uma Mahesh D


SimonXu
New Contributor II

Thanks, Jensen. It works for me!

Adedotun
New Contributor II

Unfortunately, I just encountered this error too, and I followed your solution, but it's still not working. The Usage + quotas page on Azure shows 4 cores used out of 10 (6 remaining), and the required Databricks compute is 4 cores. However, in my case I used a single-node cluster. I strongly suspect I have to switch to a multi-node cluster and then request a core quota increase from Azure. I'll be back with an update!

arpit
Valued Contributor

@Simon Xu​ 

I suspect that DLT is trying to request a machine type that you have zero quota for in your Azure account. By default, the following machine types are requested behind the scenes for DLT:

AWS: c5.2xlarge

Azure: Standard_F8s

GCP: e2-standard-8

You can also set the machine type explicitly in the pipeline's cluster configuration.
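For example, a pipeline's cluster settings can name the worker and driver machine types explicitly (a sketch of the DLT pipeline settings JSON; `Standard_DS3_v2` is only an example — pick a node type your subscription actually has quota for):

```json
{
  "clusters": [
    {
      "label": "default",
      "node_type_id": "Standard_DS3_v2",
      "driver_node_type_id": "Standard_DS3_v2",
      "num_workers": 1
    }
  ]
}
```

With an explicit `node_type_id`, DLT will request that machine type instead of its default, so the launch stays within a VM family you have quota for.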

Regards,

Arpit Khare

Masoomeh
New Contributor II

I followed @arpit's suggestion, set the cluster configuration explicitly in the pipeline's JSON settings, and that solved the issue.
