Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Lakebase in an enterprise setup

Lode
Databricks Partner

Hi,

Has anyone managed to get the new Lakebase autoscaling fully working in an enterprise Azure setup?

We are currently facing issues when setting up Lakebase autoscaling in a Databricks environment without a public IP, where all traffic is routed privately. We followed the Databricks documentation and configured private endpoints for Service Direct.

Our Databricks compute can successfully connect to Lakebase using a connection string, and the same applies from machines on our office network. So overall, connectivity is working. However, the problem appears specifically in the Lakebase UI.
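To sanity-check this kind of connectivity from a notebook, a quick stdlib-only sketch can parse the connection string and probe the endpoint over TCP (the DSN below is a placeholder, not a real instance; Lakebase exposes the Postgres wire protocol, which conventionally listens on 5432):

```python
import socket
from urllib.parse import urlparse

def endpoint_from_dsn(dsn: str) -> tuple[str, int]:
    """Extract host and port from a postgres:// connection string."""
    parsed = urlparse(dsn)
    return parsed.hostname, parsed.port or 5432  # default Postgres port

def tcp_reachable(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False  # DNS failure, timeout, or connection refused

# Hypothetical DSN -- substitute your real Lakebase connection string.
dsn = "postgresql://user@instance.database.azuredatabricks.net:5432/databricks_postgres"
host, port = endpoint_from_dsn(dsn)
print(f"{host}:{port} reachable: {tcp_reachable(host, port)}")
```

This only proves the data path works from that network location, which matches what the post describes: compute and the office network reach the database directly, while the UI takes a different route.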

When opening the tables view or using the SQL editor in the Lakebase view within the Databricks workspace, the traffic seems to be routed through a non-private endpoint.

What is working:

  • Accessing Lakebase from notebooks on shared clusters
  • Accessing Lakebase from serverless notebooks
  • Accessing Lakebase from our office network
  • UI features such as branching, creating credentials, and spinning up new Lakebase projects

What is not working:

  • Tables view and SQL editor in the Lakebase UI

From browser inspection, we see a 403 error on a POST request to:
https://api.database.westeurope.azuredatabricks.net/sql

I have attached:

  1. The error message from the Databricks workspace (tables view)
  2. Network requests from Chrome DevTools showing the failing call

Any ideas what could be missing or misconfigured?
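One way to narrow this down is to resolve the API host from the 403 request above on a machine inside the routed network, and check whether it lands on a private address or a public one. A stdlib-only sketch (the hostname is the one from the failing POST; whether it should resolve privately in your setup depends on your private DNS zone configuration):

```python
import socket
import ipaddress

def resolved_ips(host: str) -> list[str]:
    """Resolve a hostname to its IP addresses via the local resolver."""
    infos = socket.getaddrinfo(host, 443, proto=socket.IPPROTO_TCP)
    return sorted({info[4][0] for info in infos})

def is_private(addr: str) -> bool:
    """True for private ranges, i.e. traffic that stays inside the VNet."""
    return ipaddress.ip_address(addr).is_private

host = "api.database.westeurope.azuredatabricks.net"
try:
    for addr in resolved_ips(host):
        print(addr, "private" if is_private(addr) else "PUBLIC")
except socket.gaierror as exc:
    print(f"DNS resolution failed: {exc}")
```

If the host resolves to a public IP where you expected a private endpoint, the UI traffic is leaving the private path, which would be consistent with the 403.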

1 REPLY

emma_s
Databricks Employee

Hi, I think what is happening is that different types of tools get routed to Lakebase via different paths, and I believe you need to open an additional Private Link. The docs here explain it pretty well: https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/private-link

I suspect you already have port 443 opened, as this is needed for general Databricks features, which is why the things that do work are working. But you also need the inbound Private Link for performance-intensive services on port 5432. Here is the doc: https://learn.microsoft.com/en-us/azure/databricks/security/network/front-end/service-direct-private... Note that this is in Public Preview, so you would need to make sure it's enabled.
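The two-path situation described here can be probed with a simple port comparison from inside the VNet: the general 443 path versus the 5432 data path. A sketch (both hostnames are placeholders, and which hosts actually answer on 5432 in a given setup is an assumption; substitute your own workspace and database endpoints):

```python
import socket

# Hypothetical endpoints -- replace with your workspace URL and the
# database API host seen in the failing request.
CHECKS = [
    ("adb-1234567890123456.7.azuredatabricks.net", 443),        # general workspace traffic
    ("api.database.westeurope.azuredatabricks.net", 5432),       # Lakebase data path
]

def probe(host: str, port: int, timeout: float = 5.0) -> str:
    """Attempt a TCP connect and report the outcome."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return "open"
    except OSError as exc:
        return f"blocked ({exc})"

for host, port in CHECKS:
    print(f"{host}:{port} -> {probe(host, port)}")
```

If 443 is open but 5432 is blocked, that points at the missing inbound Private Link for the performance-intensive services described above.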

I hope this helps.


Thanks,

Emma