02-04-2026 10:42 PM - edited 02-04-2026 10:44 PM
Hello,
when I try to set up a model with provisioned throughput, the deployment fails with the message “Provisioned throughput is not enabled for this workspace.” It fails both for Databricks-hosted models and for third-party models from the Marketplace. I also cannot create model serving endpoints using GPU (I do have GPU quota on Azure, and I was in fact able to launch a T4 GPU ML instance on Databricks) and I receive a similar message.
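For context, this is roughly the kind of request I am sending to create the endpoint (the endpoint name, entity name, and throughput numbers below are placeholders, not my real values, following the serving-endpoints REST API shape):

```python
# Sketch of the provisioned throughput endpoint creation being attempted,
# targeting POST /api/2.0/serving-endpoints. All names/numbers here are
# placeholders for illustration only.
import json

def endpoint_spec(name, entity_name, entity_version, min_tp, max_tp):
    """Build the request body for creating a provisioned throughput endpoint."""
    return {
        "name": name,
        "config": {
            "served_entities": [{
                "entity_name": entity_name,
                "entity_version": entity_version,
                "min_provisioned_throughput": min_tp,
                "max_provisioned_throughput": max_tp,
            }]
        },
    }

# Placeholder values -- the actual model and throughput band differ.
spec = endpoint_spec("my-llama-endpoint",
                     "system.ai.llama_v3_1_8b_instruct",
                     "1", 0, 100)
print(json.dumps(spec, indent=2))
```

Posting this body (with real values) is what triggers the error above.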
It is a Premium account in the eastus2 region, which in principle allows provisioned throughput model serving, and I believe I have set all the knobs correctly (no geo enforcement in the workspace settings, no compliance requirements, sufficient user rights). I am not aware of any additional “knobs” / settings to check.
What is the bottleneck?
02-08-2026 09:05 PM
It means your workspace doesn’t have the required entitlements for provisioned throughput (and likely GPU serving) enabled, even though the region and account tier are compatible. Your Databricks account team must flip those feature flags; this entitlement flip is done by Databricks and is not a self-service workspace toggle.
02-16-2026 12:55 AM
Hello Pradeep,
thank you for your response.
Sorry for the stupid question, but how do I actually get in touch with anybody at Databricks? I tried going through Azure, but I only get template forms, none of which really addresses my issue. I could maybe subscribe to the $30 monthly support program, but that feels absurd given that the only thing I need is to enable a service I will have to pay for anyway.
For some reason I do not have access to the Databricks Help Center, and help@databricks.com is completely unresponsive.
I am no big enterprise client, but I do rack up a monthly bill with Databricks, so having to jump through all these hoops merely to find a communication channel is very frustrating and annoying.
02-17-2026 04:04 AM
Can you try the following?
https://docs.databricks.com/aws/en/workspace/navigate-workspace#get-help
https://docs.databricks.com/aws/en/resources/support
https://www.databricks.com/company/contact
Support options based on your tier:
https://www.databricks.com/support-with-enhanced
Let me know if this doesn't help.
3 weeks ago
Hi @samuel86,
I can see from the thread that you have already done quite a bit of due diligence here -- Premium workspace, eastus2 region, geo enforcement disabled, and sufficient Azure GPU quota. Let me help you work through this systematically.
Your region (eastus2) does fully support provisioned throughput, GPU model serving, and Foundation Model APIs on Azure Databricks, so the region is not the issue. Here is what to check next:
VERIFY SERVERLESS COMPUTE IS ENABLED
Provisioned throughput requires serverless compute. On Azure Databricks, serverless compute has these requirements:
1. Unity Catalog must be enabled on the workspace. If your workspace was created without Unity Catalog, you need to enable it first. You can verify by checking whether you see the "Catalog" icon in the left sidebar.
2. The workspace must not have PCI-DSS enabled in the compliance security profile.
To confirm serverless compute is working, try creating a Serverless SQL Warehouse (go to SQL Warehouses, click Create, and select "Serverless" as the type). If that works, serverless compute is enabled. If it fails, that is the root cause.
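If you prefer to check this programmatically instead of through the UI, a minimal sketch against the SQL Warehouses REST API could look like the following. It assumes `DATABRICKS_HOST` and `DATABRICKS_TOKEN` environment variables are set, and uses the `enable_serverless_compute` field from the public warehouses API; treat it as a sketch, not a definitive implementation:

```python
# Sketch: list SQL warehouses and report which ones run on serverless
# compute, via GET /api/2.0/sql/warehouses. Assumes DATABRICKS_HOST and
# DATABRICKS_TOKEN are set in the environment.
import json
import os
import urllib.request

def serverless_warehouses(warehouses):
    """Filter a list of warehouse dicts down to the serverless ones."""
    return [w for w in warehouses if w.get("enable_serverless_compute")]

def list_warehouses(host, token):
    """Fetch all SQL warehouses in the workspace."""
    req = urllib.request.Request(
        f"{host}/api/2.0/sql/warehouses",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp).get("warehouses", [])

if __name__ == "__main__" and os.getenv("DATABRICKS_HOST"):
    ws = list_warehouses(os.environ["DATABRICKS_HOST"],
                         os.environ["DATABRICKS_TOKEN"])
    print(f"serverless warehouses found: {len(serverless_warehouses(ws))}")
```

If the list comes back empty and creating a serverless warehouse in the UI also fails, that points at serverless compute not being enabled for the workspace.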
CHECK ACCOUNT CONSOLE SETTINGS
As a workspace admin, go to the Databricks Account Console at https://accounts.azuredatabricks.net/ and check:
1. Click on your workspace name, then the "Security and compliance" tab.
2. Confirm that "Enforce data processing within workspace Geography for Designated Services" is disabled (you mentioned you have already done this -- just double-checking).
VERIFY YOUR WORKSPACE TIER
You mentioned a "premium account" which is great. Just to confirm, navigate to Admin Settings in your workspace and check that the pricing tier shows "Premium." Provisioned throughput is not available on Standard tier or the free trial/Community Edition.
TRY THE PAY-PER-TOKEN ENDPOINTS FIRST
While troubleshooting, you can verify that Model Serving works at all by trying the pay-per-token Foundation Model APIs. These are preconfigured endpoints that appear automatically in the Serving tab (left sidebar). Look for endpoints like "databricks-meta-llama-3-3-70b-instruct" or similar at the top of the Serving endpoints list. If pay-per-token endpoints work but provisioned throughput does not, that narrows the problem significantly.
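A quick programmatic smoke test of a pay-per-token endpoint could look roughly like this. The endpoint name below is one of the preconfigured pay-per-token endpoints and may differ in your workspace (check the Serving tab); it assumes `DATABRICKS_HOST` and `DATABRICKS_TOKEN` are set and uses the standard serving-endpoints invocation URL:

```python
# Sketch: send one small chat request to a pay-per-token Foundation Model
# endpoint via POST /serving-endpoints/{name}/invocations. The endpoint
# name is an assumption -- use whatever appears in your Serving tab.
import json
import os
import urllib.request

def chat_payload(prompt, max_tokens=32):
    """Build a minimal OpenAI-style chat request body."""
    return {"messages": [{"role": "user", "content": prompt}],
            "max_tokens": max_tokens}

def query_endpoint(host, token, endpoint, payload):
    """Invoke a serving endpoint and return the parsed JSON response."""
    req = urllib.request.Request(
        f"{host}/serving-endpoints/{endpoint}/invocations",
        data=json.dumps(payload).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

if __name__ == "__main__" and os.getenv("DATABRICKS_HOST"):
    out = query_endpoint(os.environ["DATABRICKS_HOST"],
                         os.environ["DATABRICKS_TOKEN"],
                         "databricks-meta-llama-3-3-70b-instruct",
                         chat_payload("Say hello in one word."))
    print(out["choices"][0]["message"]["content"])
```

If this succeeds, Model Serving itself is healthy and the failure is specific to the provisioned throughput / GPU entitlement.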
IF EVERYTHING CHECKS OUT BUT THE ERROR PERSISTS
If all the above settings are correct and you still receive the "Provisioned throughput is not enabled for this workspace" error, then there may be an account-level entitlement that needs to be activated. Here are the best ways to reach the Databricks team for this:
1. In-workspace help: Click the "?" icon in the top-right corner of your Databricks workspace, then select "Support." This opens the Databricks Help Center directly from within the workspace and should let you file a support request.
2. Azure support: Since you are on Azure, you can also open a support request through the Azure Portal. Go to the Azure Portal, find your Azure Databricks resource, and click "New support request." Azure routes Databricks-specific tickets to the Databricks support team.
3. Contact form: If the above options do not work, the contact form at https://www.databricks.com/company/contact can connect you with a representative.
4. This community: You can also tag Databricks Community Managers here who may be able to escalate on your behalf.
RELEVANT DOCUMENTATION
- Foundation Model APIs overview: https://learn.microsoft.com/en-us/azure/databricks/machine-learning/foundation-model-apis/
- Deploy provisioned throughput: https://learn.microsoft.com/en-us/azure/databricks/machine-learning/foundation-model-apis/deploy-pro...
- Azure region support for model serving: https://learn.microsoft.com/en-us/azure/databricks/resources/feature-region-support
- Serverless compute requirements: https://learn.microsoft.com/en-us/azure/databricks/compute/serverless/
- Cross-Geo processing settings: https://learn.microsoft.com/en-us/azure/databricks/resources/databricks-geos
I hope this helps narrow down the cause. Please let us know what you find when running through the checklist above and we can help dig further.
* This reply was drafted with an agent system I built, which researches and drafts responses based on the wide set of documentation I have available and previous memory. I personally review each draft for obvious issues and to monitor system reliability, and I update it when I detect any drift, but there is still a small chance that something is inaccurate, especially if you are experimenting with brand-new features.
2 weeks ago
Hello Steve,
thank you for the thorough reply. Just FYI, I had to reach out to both Azure support and Databricks support to fix it.
FYI 2: serverless compute was enabled, “Enforce data processing within workspace Geography for Designated Services” was disabled, pay-per-token inference was working, and the account is Premium. I don't really know what the issue was, but Azure and Databricks fixed it.
Anyways, thank you!
Samuel