02-04-2026 10:42 PM - edited 02-04-2026 10:44 PM
Hello,
When I try to set up a model with provisioned throughput, the deployment fails with the message "Provisioned throughput is not enabled for this workspace." This happens both for Databricks-hosted models and for third-party models from the marketplace. I also cannot create GPU-backed serving endpoints (I do have GPU quota on Azure, and I was in fact able to launch a T4 GPU ML instance on Databricks); there I receive a similar message.

It is a Premium account in the eastus2 region, which in principle supports provisioned throughput model serving, and I believe all the relevant settings are correct: no geo enforcement in the workspace settings, no compliance requirements, and sufficient user rights. I am not aware of any additional settings to check.
What is the bottleneck?
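For context, this is the kind of request that fails, a minimal sketch assuming the standard `/api/2.0/serving-endpoints` REST API; the endpoint name, model reference, and throughput values are placeholders, not my actual configuration:

```python
import json

# Placeholder payload for a provisioned throughput serving endpoint.
# All names and numbers are illustrative only.
config = {
    "name": "pt-test-endpoint",  # placeholder endpoint name
    "config": {
        "served_entities": [
            {
                # Placeholder Unity Catalog model reference
                "entity_name": "catalog.schema.model_name",
                "entity_version": "1",
                # Provisioned throughput range (placeholder values)
                "min_provisioned_throughput": 0,
                "max_provisioned_throughput": 100,
            }
        ]
    },
}

# The failing deployment corresponds to POSTing this payload:
#   POST https://<workspace-url>/api/2.0/serving-endpoints
#   Authorization: Bearer <personal access token>
print(json.dumps(config, indent=2))
```

The same "not enabled for this workspace" error appears regardless of the model referenced in `served_entities`.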
- Labels:
  - Model Serving
  - provisioned throughput