Hi!
I have some issues setting up my workspace and storage account within a virtual network and getting them to connect to each other.
Background setup (all done in Terraform; a simplified sketch of the Unity Catalog pieces follows the list):
- Databricks workspace with VNet injection and Unity Catalog enabled (3 subnets: 2 delegated to the Databricks workspace, 1 for the private endpoint used for the front-end Private Link)
- One Azure storage account (public network access disabled) with a private endpoint in a new subnet within the same VNet (sub-resource type 'blob')
- A Databricks access connector with the Storage Blob Data Contributor role on the storage account
- An external location that uses the access connector as its storage credential, points to the storage account, and serves as the root of my catalog
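The Unity Catalog side of this looks roughly like the sketch below (simplified; resource names, the resource group reference, and the container name are placeholders for what I actually use):

```hcl
# Access connector with a system-assigned managed identity
resource "azurerm_databricks_access_connector" "uc" {
  name                = "ac-databricks-uc" # placeholder name
  resource_group_name = azurerm_resource_group.this.name
  location            = azurerm_resource_group.this.location

  identity {
    type = "SystemAssigned"
  }
}

# Grant the connector's identity access to the storage account
resource "azurerm_role_assignment" "uc_storage" {
  scope                = azurerm_storage_account.uc.id
  role_definition_name = "Storage Blob Data Contributor"
  principal_id         = azurerm_databricks_access_connector.uc.identity[0].principal_id
}

# Unity Catalog storage credential backed by the access connector
resource "databricks_storage_credential" "uc" {
  name = "sc-uc-root" # placeholder name
  azure_managed_identity {
    access_connector_id = azurerm_databricks_access_connector.uc.id
  }
}

# External location used as the root of my catalog
resource "databricks_external_location" "catalog_root" {
  name            = "catalog-root" # placeholder name
  url             = "abfss://catalog@${azurerm_storage_account.uc.name}.dfs.core.windows.net/"
  credential_name = databricks_storage_credential.uc.name
}
```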
Issue:
I get the following errors when starting my DLT pipeline (not serverless) or running a query on a SQL warehouse that writes data to my catalog:
- Failed to initialize the UCContext
- DLT ERROR CODE: EXECUTION_SERVICE_STARTUP_FAILURE.AZURE_STORAGE_PERMISSION_ISSUE
- 403, This Azure storage request is not authorized. The storage account's 'Firewalls and virtual networks' settings may be blocking access to storage services. Please verify your Azure storage credentials or firewall exception settings.
It seems my compute still cannot connect to the storage account, and my impression is that the clusters are not going through the private endpoint when they reach it. Does anyone know how to resolve this? What am I missing?
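For reference, the storage account and its private endpoint are defined roughly like this (again simplified, with placeholder names and address ranges):

```hcl
# Storage account with public network access disabled
resource "azurerm_storage_account" "uc" {
  name                          = "stucplaceholder" # placeholder name
  resource_group_name           = azurerm_resource_group.this.name
  location                      = azurerm_resource_group.this.location
  account_tier                  = "Standard"
  account_replication_type      = "LRS"
  is_hns_enabled                = true  # hierarchical namespace (ADLS Gen2)
  public_network_access_enabled = false
}

# Dedicated subnet for the storage private endpoint (same VNet as the workspace)
resource "azurerm_subnet" "pe_storage" {
  name                 = "snet-pe-storage" # placeholder name
  resource_group_name  = azurerm_resource_group.this.name
  virtual_network_name = azurerm_virtual_network.this.name
  address_prefixes     = ["10.0.4.0/27"] # placeholder range
}

# Private endpoint for the 'blob' sub-resource of the storage account
resource "azurerm_private_endpoint" "blob" {
  name                = "pe-storage-blob" # placeholder name
  resource_group_name = azurerm_resource_group.this.name
  location            = azurerm_resource_group.this.location
  subnet_id           = azurerm_subnet.pe_storage.id

  private_service_connection {
    name                           = "psc-storage-blob"
    private_connection_resource_id = azurerm_storage_account.uc.id
    subresource_names              = ["blob"]
    is_manual_connection           = false
  }
}
```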
Thank you in advance.