Data Engineering

Quota Limit Exhausted Error when Creating declarative pipeline

crami
New Contributor II

I am trying to develop a declarative pipeline. Per platform policy I cannot use serverless compute, which is why I am using an asset bundle to create the declarative pipeline, and in the bundle I am specifying compute for the pipeline. However, I keep hitting a quota exhausted exception.
Below is the compute config I am specifying:

[screenshot: pipeline compute configuration in the bundle]
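
For reference, the pipeline resource in my bundle is shaped roughly like the sketch below; the resource name, node type, and worker counts are illustrative placeholders, not the exact values from the screenshot.

resources:
  pipelines:
    my_declarative_pipeline:
      name: my_declarative_pipeline
      serverless: false                  # platform policy: no serverless
      clusters:
        - label: default
          node_type_id: Standard_DS3_v2  # placeholder; I have tried DSv2 and DSv3 families
          autoscale:
            min_workers: 1
            max_workers: 2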

 

Below is my quota availability:
[screenshot: subscription quota availability]

I have tried specifying DSv2 and DSv3 family node types; all hit the same issue.
Is there any specific configuration I need to specify for the cluster to instantiate?
Exception:

[screenshot: quota limit exhausted exception]

 

2 REPLIES

Khaja_Zaffer
Contributor III

Hello @crami 

Good day!!

As the error indicates, you need more VM quota. I know the quota on your side looks sufficient, but the combination of spot fallback + Photon + autoscale can push the request over the limit and trigger this failure.
 
 
Go to Azure Portal → Subscriptions → Usage + quotas, then filter on Provider = Microsoft.Compute and Location = <your workspace region> to confirm the available quota for that VM family.

 

Also, did you try setting availability to "ON_DEMAND" (on-demand instead of spot)?
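
On Azure the availability setting lives under azure_attributes on the pipeline cluster, and the on-demand value there is ON_DEMAND_AZURE. A minimal sketch of the cluster block inside the pipeline resource (node type and worker counts are placeholders, adjust to your bundle):

clusters:
  - label: default
    node_type_id: Standard_DS3_v2       # placeholder; pick a family with free vCPU quota
    azure_attributes:
      availability: ON_DEMAND_AZURE     # on-demand capacity only, no spot or spot fallback
    autoscale:
      min_workers: 1
      max_workers: 2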

Redeploy the bundle (databricks bundle deploy) and start the pipeline. Running on demand might bypass the spot quota/capacity checks.

 

I am also looking forward to resolutions from other contributors.

 

Thank you.

crami
New Contributor II

Thanks for the pointer.
