Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Connecting DBeaver to Databricks Lakebase — Setup & Troubleshooting

Abhishek_sinha
New Contributor II

I recently connected DBeaver to Databricks Lakebase and wanted to share the setup steps along with a couple of troubleshooting issues I encountered.

Since Lakebase is PostgreSQL-compatible, the standard PostgreSQL driver works directly without requiring the Databricks JDBC driver.

Setup Flow

Step 1: Create a Native PostgreSQL Role

Navigate to: Lakebase Project → Branch Overview → Roles & Databases → Add Role

Set Authentication Type to Password. Avoid OAuth for desktop tools because tokens expire every hour.

Step 2: Parse the Connection String

Example connection string: postgresql://role:password@endpoint.azuredatabricks.net/databricks_postgres?sslmode=require

Map these values into DBeaver fields:

  • Host: endpoint hostname
  • Port: 5432
  • Database: databricks_postgres
  • Username: role name
  • Password: generated password (download the .env file when the role is created; it is shown only once)
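The mapping above can be sketched in a few lines of Python using the standard library's URL parser. This is just an illustration of how the connection string decomposes into DBeaver's fields; the role name and password below are made up:

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical connection string in the same shape as the example above
conn = "postgresql://my_role:s3cret@endpoint.azuredatabricks.net/databricks_postgres?sslmode=require"

parts = urlparse(conn)
fields = {
    "Host": parts.hostname,
    "Port": parts.port or 5432,            # Lakebase uses the default PostgreSQL port
    "Database": parts.path.lstrip("/"),    # path component minus the leading slash
    "Username": parts.username,
    "Password": parts.password,
    "SSL Mode": parse_qs(parts.query)["sslmode"][0],
}
for key, value in fields.items():
    print(f"{key}: {value}")
```

Pasting the whole connection string into DBeaver's URL field also works, but filling the individual fields makes it easier to spot which value is wrong when a connection fails.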

Step 3: Configure DBeaver

Create a new connection using the PostgreSQL driver (not the Databricks driver).

SSL Configuration:

  • Use SSL: Enabled
  • SSL Mode: require

Issues I Encountered

Issue 1: SSL Handshake Failure

Error: PKIX path building failed — unable to find valid certification path to requested target

Fix: Go to DBeaver Preferences → Connections, uncheck "Use Windows trust store", then restart DBeaver.

Issue 2: Host Resolves but Connection Fails

nslookup resolved the hostname, but DBeaver still failed to connect and reported an Unknown host error.

Root cause: Private Link / VNet routing. The Lakebase endpoint was reachable only through the corporate VPN route. nslookup confirmed this: it returned the name but no public IP, a classic Private Link signature.
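A quick way to spot this signature programmatically is to check whether the hostname resolves only to private (RFC 1918 / loopback) addresses. A minimal sketch with the Python standard library; the function name and the choice of port 5432 are mine, not anything from Databricks tooling:

```python
import ipaddress
import socket

def resolution_signature(host: str) -> str:
    """Classify DNS resolution: private-only answers suggest Private Link / VPN routing."""
    try:
        # Collect every address the resolver returns for this host
        addrs = {info[4][0] for info in socket.getaddrinfo(host, 5432)}
    except socket.gaierror:
        return "unresolvable"
    if all(ipaddress.ip_address(a).is_private for a in addrs):
        return "private-only"  # reachable only via internal routing, e.g. over VPN
    return "public"
```

Run from inside and outside the corporate network: "private-only" inside and "unresolvable" (or a public answer) outside is consistent with a Private Link endpoint.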

Why This Is Interesting

Lakebase brings transactional PostgreSQL workloads closer to the lakehouse ecosystem while aligning with Unity Catalog governance. Using familiar tools like DBeaver lowers the adoption barrier for teams already comfortable with relational database workflows.

Would love to hear if others are testing Lakebase for hybrid OLTP plus analytics use cases.


amirabedhiafi
New Contributor III

Hello @Abhishek_sinha!

 

Thanks for sharing this! Very useful 😄

A few things I can add from personal experience: it is better to use the PostgreSQL driver rather than the Databricks JDBC driver, because Lakebase is PostgreSQL-compatible and DBeaver should be configured as a standard PostgreSQL connection. OAuth can work, but it is inconvenient for desktop clients because Lakebase OAuth tokens expire after one hour. Native Postgres password roles do not have that hourly token expiry, so they are better for DBeaver, pgAdmin, scripts, and other clients that do not automatically refresh tokens.

One last thing: setting sslmode=require is important. For the PKIX / certificate issue, disabling "Use Windows trust store" works in some environments, but on managed corporate machines the cleaner fix may be to import the corporate CA certificate into the Java trust store.
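For anyone trying the trust-store route, the import can be done with the JDK's keytool. The paths, alias, and certificate file below are hypothetical placeholders, and note that DBeaver may ship with its own bundled JRE, so point the keystore at the Java installation DBeaver actually runs on:

```shell
# Import the corporate root CA into the Java trust store used by DBeaver.
# Adjust JAVA_HOME and the certificate path for your environment;
# "changeit" is the default cacerts password on most JDK installs.
keytool -importcert \
  -trustcacerts \
  -alias corp-root-ca \
  -file corp_root_ca.pem \
  -keystore "$JAVA_HOME/lib/security/cacerts" \
  -storepass changeit
```

After the import, restart DBeaver so the JVM picks up the updated trust store.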

If this answer resolves your question, could you please mark it as “Accept as Solution”? It will help other users quickly find the correct fix.

Senior BI/Data Engineer | Microsoft MVP Data Platform | Microsoft MVP Power BI | Power BI Super User | C# Corner MVP

Thanks for adding these insights — really valuable points from practical experience. 😊

 

Completely agree that using the native PostgreSQL driver makes much more sense for Lakebase compared to the Databricks JDBC driver, especially for tools like DBeaver and pgAdmin. The OAuth token expiry limitation is something many people only discover after setup, so your explanation around native Postgres roles is super helpful.

 

Also appreciate the note on sslmode=require and the PKIX/certificate handling. Corporate trust store issues can definitely be tricky, and your suggestion about importing the CA certificate into the Java trust store is a great long-term fix for managed environments.

 

Thanks again for sharing such detailed observations!
