Lakebase Discussions
Ask questions, share challenges, and connect with others working on Lakebase. From troubleshooting to best practices, this is where conversations happen.

Databricks App: how to set up a Lakebase Postgres connection locally

ctgchris
New Contributor III

I'm developing a FastAPI middleware app (Databricks App) that connects to both a SQL Warehouse (Unity Catalog) and a Lakebase PostgreSQL instance using async SQLAlchemy. The app works perfectly when deployed to Databricks, but I'm trying to set up local development using databricks apps run-local on Windows 11 with Python 3.11.

What works:

databricks apps run-local starts the app and proxy successfully
Databricks CLI authentication works (databricks-cli auth type)
SQL Warehouse / Unity Catalog endpoints work perfectly locally
Lakebase SDK calls succeed — generate_database_credential(), get_database_instance(), and current_user.me() all return valid responses
The Lakebase PostgreSQL DNS resolves and port 5432 is reachable
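For reference, the last check in that list can be scripted. This is a minimal sketch of a TCP reachability probe; the host value shown in the example comment is a placeholder, not a real instance:

```python
# Minimal sketch of the reachability check above: attempt a TCP connection
# to the Lakebase host on port 5432. Host value is an assumption.
import socket

def port_open(host: str, port: int = 5432, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to (host, port) succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example (hypothetical host):
# port_open("instance-xxxx.database.cloud.databricks.com")
```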


As far as I can tell, the Databricks documentation nowhere states that Lakebase Postgres is supported for local development. Does this mean I can't run my FastAPI endpoints that query Lakebase Postgres tables locally? Is the only way to test them to deploy the app to Databricks?

Any feedback is appreciated.


1 REPLY

Lu_Wang_ENB_DBX
Databricks Employee

Summary: You should be able to test your FastAPI endpoints locally. Lakebase supports direct external connections over the standard PostgreSQL wire protocol, so your local SQLAlchemy setup can query the Lakebase instance directly without being deployed to Databricks first.
Direct Connectivity Support: Lakebase allows direct connections from local terminals, external tools (like DBeaver or pgAdmin), and local application development environments using standard PostgreSQL drivers like SQLAlchemy and psycopg2.
Authentication: When deployed, the Databricks App automatically exposes PGHOST and PGUSER and provisions a role tied to the app's service principal. For local development, you simply need to authenticate your local SQLAlchemy connection using either a Databricks OAuth token (which you can generate via the CLI/SDK) or native PostgreSQL username and password credentials.
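As a sketch of the OAuth path, assuming the databricks-sdk package is installed and a local CLI profile is configured (the instance name passed in is a placeholder):

```python
# Sketch: mint a short-lived OAuth credential for a Lakebase instance via the
# Databricks SDK, for use as the PostgreSQL password. The SDK import is
# deferred so the function can be defined without databricks-sdk installed.
import uuid

def fetch_lakebase_token(instance_name: str) -> str:
    """Return a short-lived credential token for the named Lakebase instance."""
    from databricks.sdk import WorkspaceClient  # requires databricks-sdk
    w = WorkspaceClient()  # picks up local Databricks CLI authentication
    cred = w.database.generate_database_credential(
        request_id=str(uuid.uuid4()),
        instance_names=[instance_name],
    )
    return cred.token
```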
Network Reachability: Since you have already verified that the Lakebase DNS resolves and port 5432 is reachable, you are not being blocked by workspace IP ACLs or network security settings.
Next Steps: Ensure your local environment variables mimic what the Databricks App would inject (specifically the host and user information), and pass your generated OAuth token or database password directly into your async SQLAlchemy connection string.
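A minimal sketch of that connection-string assembly, assuming PGHOST and PGUSER are exported locally and the generated OAuth token is used as the password (the default database name is a placeholder):

```python
# Sketch: assemble an asyncpg-flavoured SQLAlchemy URL from the same
# environment variables a deployed Databricks App would inject. The fallback
# database name is an assumption; substitute your own.
import os
from urllib.parse import quote_plus

def lakebase_dsn(token: str) -> str:
    user = quote_plus(os.environ["PGUSER"])
    pwd = quote_plus(token)  # tokens may contain URL-special characters
    host = os.environ["PGHOST"]
    db = os.environ.get("PGDATABASE", "databricks_postgres")
    # Lakebase connections use TLS, hence ssl=require on the query string.
    return f"postgresql+asyncpg://{user}:{pwd}@{host}:5432/{db}?ssl=require"
```

The resulting string can be handed straight to `create_async_engine`.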

This blog entry has more detailed instructions on how to set up SQLAlchemy: https://www.databricks.com/blog/how-use-lakebase-transactional-data-layer-databricks-apps