
How to access storage with private endpoint

jx1226
New Contributor II

We know that Databricks with VNET injection (our own VNET) allows us to connect to blob storage/ADLS Gen2 over private endpoints and peering. This is what we typically do.

  • We have a client who created Databricks with EnableNoPublicIP=No (i.e., secure cluster connectivity disabled) and VnetInjection=No. So it's using a managed VNET in the Databricks managed resource group and is exposed with a public IP. We're wondering if we can still make it connect to blob storage/ADLS Gen2 over private endpoints, or whether we need to delete and recreate the Databricks workspace with VNET injection.
  • We want to use OAuth2 with a Service Principal that has the Storage Blob Data Contributor role on the blob storage/ADLS Gen2.
  • We want to mount the storage in the workspace with Service Principal credentials.
  • In the customer's workspace, Unity Catalog (UC) is not activated, so there is no possibility of using a UC access connector.
  • So basically my question is: can we use this workspace setup (EnableNoPublicIP=No and VnetInjection=No) to access storage over a private endpoint using mounts?
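For reference, the kind of mount we want to create looks roughly like the sketch below (the `<...>` placeholders and the helper function name are ours, not from any Databricks API; `dbutils` only exists inside a Databricks notebook, so the actual mount call is shown commented out):

```python
def build_oauth_configs(client_id: str, client_secret: str, tenant_id: str) -> dict:
    """Build the ABFS OAuth options passed to dbutils.fs.mount as extra_configs."""
    return {
        "fs.azure.account.auth.type": "OAuth",
        "fs.azure.account.oauth.provider.type":
            "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        "fs.azure.account.oauth2.client.id": client_id,
        "fs.azure.account.oauth2.client.secret": client_secret,
        "fs.azure.account.oauth2.client.endpoint":
            f"https://login.microsoftonline.com/{tenant_id}/oauth2/token",
    }

# Inside a Databricks notebook (dbutils is only available there):
# configs = build_oauth_configs(
#     client_id="<application-id>",
#     client_secret=dbutils.secrets.get(scope="<scope>", key="<key>"),
#     tenant_id="<tenant-id>",
# )
# dbutils.fs.mount(
#     source="abfss://<container>@<storage-account>.dfs.core.windows.net/",
#     mount_point="/mnt/<mount-name>",
#     extra_configs=configs,
# )
```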

Kaniz_Fatma
Community Manager

Hi @jx1226 , Certainly! Let's break down the situation and address your questions:

  1. Workspace Configuration:

    • Your clientโ€™s Databricks workspace is currently set up with the following parameters:
      • EnableNoPublicIP=No (i.e., secure cluster connectivity is disabled)
      • VnetInjection=No (using a managed VNET in the Databricks managed resource group and exposed with a public IP).
    • The question is whether this configuration allows connecting to blob storage or ADLS Gen2 over private endpoints.
  2. Private Link and Workspace Connectivity:

    • Private Link provides a way to establish private connectivity between Azure VNets and Azure services without exposing traffic to the public network.
    • Azure Databricks supports two types of Private Link connections:
      • Front-end Private Link (User to Workspace): This allows users to connect to the Azure Databricks web application, REST API, and Databricks Connect API over a VNet interface endpoint. It's also used for JDBC/ODBC and PowerBI integrations.
      • Back-end Private Link (Compute Plane to Control Plane): Databricks Runtime clusters in a customer-managed VNet (compute plane) connect to an Azure Databricks workspaceโ€™s core services (control plane) in the Azure Databricks cloud account. This enables private connectivity from the clusters to the secure cluster connectivity relay endpoint and REST API endpoint.
  3. VNet Injection Requirement:

    • Back-end Private Link requires the compute plane (the Databricks Runtime clusters) to run in a customer-managed VNet, i.e., a workspace deployed with VNet injection. A workspace created with VnetInjection=No cannot be switched to VNet injection in place; it would have to be recreated to use back-end Private Link.
  4. Mounting with Service Principal Credentials:

    • You mentioned using OAuth2 with a Service Principal having the Storage Blob Data Contributor role on the blob storage or ADLS Gen2.
    • You can still mount the storage in your workspace using Service Principal credentials, even with the current workspace setup.
    • Ensure that the Service Principal has the necessary permissions to access the storage resources.
  5. User-Managed VNet and UC Access Connector:

    • Since your customer workspace does not have Unity Catalog (UC) activated, you won't be able to use UC access connectors.
    • However, this doesnโ€™t impact the ability to use private endpoints for storage.
  6. Conclusion:

    • In summary, you can use the existing workspace setup (with EnableNoPublicIP=No and VnetInjection=No) to access storage with private endpoints using mounting.
    • Just make sure the Service Principal is configured correctly and that name resolution and network routing to the storage private endpoint are in place from the workspace's VNet.
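One way to check whether storage traffic would actually go through a private endpoint: resolve the storage account's FQDN from a cluster and see whether it returns a private address. A minimal sketch using only the Python standard library (the `<storage-account>` placeholder is an assumption; replace it with the real account name):

```python
import ipaddress
import socket

def is_private_ip(ip: str) -> bool:
    """True if the address is in a private range (RFC 1918 etc.)."""
    return ipaddress.ip_address(ip).is_private

# Run on a cluster in the workspace:
# host = "<storage-account>.dfs.core.windows.net"
# ip = socket.gethostbyname(host)
# print(ip, is_private_ip(ip))
# A private address means private-endpoint DNS resolution is in effect;
# a public address means traffic is still going over the public endpoint.
```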

Feel free to proceed with this configuration, and if you encounter any issues, let us know.
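As an alternative to mounts (a mount exposes the storage to all workspace users with the mounting identity's permissions), the same service principal can be used for direct `abfss://` access by setting per-storage-account OAuth options on the Spark session. A sketch, with placeholder values and a helper function of our own naming:

```python
def spark_oauth_settings(account: str, client_id: str,
                         client_secret: str, tenant_id: str) -> dict:
    """Per-account ABFS OAuth options to apply via spark.conf.set."""
    suffix = f"{account}.dfs.core.windows.net"
    return {
        f"fs.azure.account.auth.type.{suffix}": "OAuth",
        f"fs.azure.account.oauth.provider.type.{suffix}":
            "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        f"fs.azure.account.oauth2.client.id.{suffix}": client_id,
        f"fs.azure.account.oauth2.client.secret.{suffix}": client_secret,
        f"fs.azure.account.oauth2.client.endpoint.{suffix}":
            f"https://login.microsoftonline.com/{tenant_id}/oauth2/token",
    }

# In a Databricks notebook:
# for key, value in spark_oauth_settings(
#         "<storage-account>", "<application-id>",
#         dbutils.secrets.get(scope="<scope>", key="<key>"),
#         "<tenant-id>").items():
#     spark.conf.set(key, value)
# df = spark.read.parquet(
#     "abfss://<container>@<storage-account>.dfs.core.windows.net/<path>")
```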

IvoDB
New Contributor II

Hey @jx1226 , were you able to solve this at the customer? I am currently struggling with same issues here.
