04-08-2022 06:51 AM
I'm trying to run through the Delta Live Tables quickstart example on Azure Databricks. When trying to start the pipeline I get the following error:
Failed to launch pipeline cluster 0408-131049-n3g9vr4r: The operation could not be performed on your account with the following error message: azure_error_code: OperationNotAllowed, azure_error_message: Operation could not be completed as it results in...
This is the full message as it appears in the Pipeline Event Log details (both the Summary and the JSON view). As far as I can tell, Azure throws this error when spinning up the cluster would exceed a vCPU quota. However, in the Azure usage overview none of the vCPU quotas seem anywhere near their limit. Is there some way of seeing the entire error message? That would at least tell me which vCPU family is involved, as I cannot find out which one the pipeline cluster uses.
05-06-2022 06:07 AM
This was indeed caused by Databricks picking a VM type whose vCPU quota was already exhausted. To solve this, add an explicit node type to the pipeline's JSON settings:
"clusters": [
{
"label": "default",
"node_type_id": "Standard_DS3_v2",
"driver_node_type_id": "Standard_DS3_v2",
}
],
Note that the UI version of the settings doesn't seem to support changing this, hence the need to edit the JSON version.
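If editing the JSON by hand in the settings dialog is awkward, the same change can in principle be pushed through the Delta Live Tables Pipelines REST API. This is only a rough sketch, not something taken from this thread: the workspace URL, token and pipeline ID are placeholders, and the exact request/response shapes (the "spec" field in the GET response, PUT accepting the full settings) should be double-checked against the current API reference.
import requests

# Placeholders - fill in your own workspace URL, personal access token and pipeline ID.
host = "https://adb-1234567890123456.7.azuredatabricks.net"
token = "<personal-access-token>"
pipeline_id = "<pipeline-id>"
headers = {"Authorization": f"Bearer {token}"}

# Fetch the current pipeline settings.
resp = requests.get(f"{host}/api/2.0/pipelines/{pipeline_id}", headers=headers)
resp.raise_for_status()
spec = resp.json()["spec"]

# Pin the cluster to a VM type whose vCPU quota still has headroom.
spec["clusters"] = [
    {
        "label": "default",
        "node_type_id": "Standard_DS3_v2",
        "driver_node_type_id": "Standard_DS3_v2",
    }
]

# Write the updated settings back to the pipeline.
resp = requests.put(f"{host}/api/2.0/pipelines/{pipeline_id}", headers=headers, json=spec)
resp.raise_for_status()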
05-26-2022 03:52 AM
Hi @Kaniz Fatma,
I am using the 14-day free trial of Databricks on the Azure platform.
I am getting the same error. What can I do?
Failed to launch pipeline cluster 0526-095900-zd8jcs62: The operation could not be performed on your account with the following error message: azure_error_code: OperationNotAllowed, azure_error_message: Operation could not be completed as it results in...
Thanks,
Devashish
05-06-2022 06:00 AM
I am also receiving this error and am a premium customer.
@Kaniz Fatma for the win! Your post got me to look at my current quotas in Azure, and I was at the limit for the vCPU family chosen for processing Delta, so I requested a quota increase, restarted the pipeline, and everything worked! Thanks
07-14-2022 12:43 AM
I am having this same issue. My quota seems fine and I have tried setting an explicit node type in the JSON, to no avail. I am on Premium.
Any ideas?
08-02-2022 10:41 AM
Same here. Banging my head over here.
10-13-2022 01:26 AM
Hi there, I'm still facing this issue with Azure Databricks. My quotas look alright. Is there anything else that I have to check? Has this been answered elsewhere? Please let me know more. TIA!
01-31-2023 01:54 AM
Hi - just in case anyone else is still experiencing this issue - here is how I fixed it...
Go to the 'Activity log' in your Azure Portal and look for any errors logged while the Databricks pipeline is running...
I received this error in my Activity Log: "Create or Update Virtual Machine - Failed" Operation could not be completed as it results in exceeding the approved standardFSFamily Cores quota. Additional details - Deployment Model: Resource Manager, Location: uksouth, Current Limit: 10, Current Usage: 8, Additional Required: 8, (Minimum) New Limit Required: 16. Submit a request for Quota increase at [***URL***] by specifying parameters listed in the ‘Details’ section for deployment to succeed. Please read more about quota limits at https://docs.microsoft.com/en-us/azure/azure-supportability/per-vm-quota-requests
I then requested an increase for the following quotas:
Standard FS Family vCPUs increased from 10 to 50
Standard DSv2 Family vCPUs increased from 10 to 50
Then re-run the pipeline - hopefully this fixes the issue for you too 🙂
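If you want to see these numbers without digging through the Activity log, the per-VM-family usage and limits for a region can also be listed with the Azure SDK for Python. A minimal sketch, assuming the azure-identity and azure-mgmt-compute packages are installed and the subscription ID / region placeholders are replaced with your own values:
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

subscription_id = "<subscription-id>"   # placeholder
region = "uksouth"                      # region of your Databricks workspace

compute = ComputeManagementClient(DefaultAzureCredential(), subscription_id)

# List per-family compute quota usage and flag anything close to its limit.
for usage in compute.usage.list(region):
    if usage.limit and usage.current_value / usage.limit >= 0.8:
        print(f"{usage.name.localized_value}: {usage.current_value} of {usage.limit}")
The families named in the Activity log error (Standard FS, Standard DSv2) should show up in that output if they are the ones hitting their quota.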
04-24-2023 10:03 AM
This thread really helped me. I am now able to execute the DLT pipeline successfully. Thanks to all contributors.