06-04-2025 02:27 AM
When working with Azure Databricks (with VNet injection) to connect securely to an Azure Storage account via private endpoint, there are a few places that need connectivity. First, the VNet that Databricks is injected into, which works well when connecting with the blob client in a notebook.
Next, serverless compute, following this guide: Configure private connectivity from serverless compute - Azure Databricks | Microsoft Learn. This doesn't seem to work: performing the same operation fails with `This request is not authorized to perform this operation.`
But most importantly, none of this touches on the control plane, which Unity Catalog uses for external locations, and I can't find any documentation on creating a private endpoint to the control plane at all.
Is there any guidance on how to create an external location for an Azure Storage account that is behind a private endpoint? Trying to create it fails with `Failed to access cloud storage: [AbfsRestOperationException] () exceptionTraceId=XXX`, and no other details.
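As a first sanity check from a notebook on the VNet-injected cluster, one can verify that both storage sub-resource endpoints resolve to private addresses, which indicates the private endpoint DNS is actually in effect. A minimal sketch, where the account name is a placeholder and not from this thread:

```python
# Hedged sketch: check whether the blob and dfs endpoints of a storage
# account resolve to private (RFC 1918) addresses from this network.
# "mystorageacct" is a placeholder account name.
import ipaddress
import socket

def resolve(host: str) -> str:
    """Return the first resolved address for host on port 443."""
    return socket.getaddrinfo(host, 443)[0][4][0]

def is_private(ip: str) -> bool:
    """True if the address is private, i.e. the private endpoint DNS applies."""
    return ipaddress.ip_address(ip).is_private

account = "mystorageacct"
for suffix in ("blob", "dfs"):
    host = f"{account}.{suffix}.core.windows.net"
    try:
        ip = resolve(host)
        print(host, ip, "private" if is_private(ip) else "PUBLIC - check DNS")
    except OSError as exc:
        # Resolution failure also signals a DNS/network misconfiguration.
        print(host, "resolution failed:", exc)
```

If either endpoint comes back with a public address, the private DNS zone (`privatelink.blob.core.windows.net` / `privatelink.dfs.core.windows.net`) is likely not linked to the VNet.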
06-04-2025 04:26 AM
Make sure private endpoints exist for both the dfs and blob sub-resources of the storage account, as both are required for operations. You can use curl or nslookup to confirm that the private endpoints and network configuration are operational.
06-04-2025 11:47 AM
I've read that in the documentation, and when I now tried with an Access Connector for Azure Databricks instead of my own service principal, it seems to have worked, shockingly, even if I completely block network access on the storage account with zero private endpoints. No idea how, but for anyone coming across this, the solution is to use an Access Connector for Azure Databricks.
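For anyone following the Access Connector route, the storage credential and external location can also be created through the Unity Catalog REST API rather than the UI. A minimal sketch, assuming hypothetical workspace URL, token, and resource names (none of these values come from the thread):

```python
# Hedged sketch: create a Unity Catalog storage credential backed by an
# Access Connector for Azure Databricks, then an external location using it.
# All names, IDs, and the workspace URL are placeholders.
import json
import urllib.request

def credential_payload(name: str, access_connector_id: str) -> dict:
    """Request body for POST /api/2.1/unity-catalog/storage-credentials."""
    return {
        "name": name,
        "azure_managed_identity": {"access_connector_id": access_connector_id},
    }

def external_location_payload(name: str, url: str, credential_name: str) -> dict:
    """Request body for POST /api/2.1/unity-catalog/external-locations."""
    return {"name": name, "url": url, "credential_name": credential_name}

def post(workspace_url: str, path: str, token: str, body: dict) -> bytes:
    """POST a JSON body to the workspace API (needs a valid PAT/AAD token)."""
    req = urllib.request.Request(
        f"{workspace_url}{path}",
        data=json.dumps(body).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read()

# Example payloads (placeholder Azure resource ID, left elided on purpose):
cred = credential_payload(
    "my-access-connector-cred",
    "/subscriptions/.../providers/Microsoft.Databricks/accessConnectors/my-connector",
)
loc = external_location_payload(
    "my-external-location",
    "abfss://container@mystorageacct.dfs.core.windows.net/path",
    "my-access-connector-cred",
)
```

The Access Connector's managed identity still needs the Storage Blob Data Contributor role on the storage account for the validation to pass.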