<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Databricks App how to set up lakebase postgres connection locally in Lakebase Discussions</title>
    <link>https://community.databricks.com/t5/lakebase-discussions/databricks-app-how-to-setup-lakebase-postgres-connection-locally/m-p/149740#M26</link>
    <description>&lt;P&gt;&lt;STRONG&gt;Summary:&lt;/STRONG&gt; You should be able to test your FastAPI endpoints locally. Lakebase supports direct external connections over the standard PostgreSQL wire protocol, so your local SQLAlchemy setup can query the Lakebase instance directly without being deployed to Databricks first.&lt;BR /&gt;&lt;STRONG&gt;Direct connectivity:&lt;/STRONG&gt; Lakebase accepts direct connections from local terminals, external tools (such as DBeaver or pgAdmin), and local development environments using standard PostgreSQL tooling such as the psycopg2 driver or SQLAlchemy.&lt;BR /&gt;&lt;STRONG&gt;Authentication:&lt;/STRONG&gt; When deployed, the Databricks App automatically exposes PGHOST and PGUSER and provisions a role tied to the app's service principal. For local development, authenticate your local SQLAlchemy connection with either a Databricks OAuth token (which you can generate via the CLI/SDK) or native PostgreSQL username and password credentials.&lt;BR /&gt;&lt;STRONG&gt;Network reachability:&lt;/STRONG&gt; Since you have already verified that the Lakebase DNS resolves and port 5432 is reachable, you are not being blocked by workspace IP ACLs or other network security settings.&lt;BR /&gt;&lt;STRONG&gt;Next steps:&lt;/STRONG&gt; Make sure your local environment variables mimic what the Databricks App would inject (specifically the host and user information), and pass your generated OAuth token or database password directly into your async SQLAlchemy connection string.&lt;/P&gt;
&lt;P&gt;This blog entry has more detailed instructions on how to set up SQLAlchemy:&amp;nbsp;&lt;A href="https://www.databricks.com/blog/how-use-lakebase-transactional-data-layer-databricks-apps" target="_blank"&gt;https://www.databricks.com/blog/how-use-lakebase-transactional-data-layer-databricks-apps&lt;/A&gt;&lt;/P&gt;</description>
    <pubDate>Tue, 03 Mar 2026 23:18:16 GMT</pubDate>
    <dc:creator>Lu_Wang_ENB_DBX</dc:creator>
    <dc:date>2026-03-03T23:18:16Z</dc:date>
    <item>
      <title>Databricks App how to set up lakebase postgres connection locally</title>
      <link>https://community.databricks.com/t5/lakebase-discussions/databricks-app-how-to-setup-lakebase-postgres-connection-locally/m-p/149736#M25</link>
      <description>&lt;P&gt;I'm developing a FastAPI middleware app (Databricks App) that connects to both a SQL Warehouse (Unity Catalog) and a Lakebase PostgreSQL instance using async SQLAlchemy. The app works perfectly when deployed to Databricks, but I'm trying to set up local development using databricks apps run-local on Windows 11 with Python 3.11.&lt;/P&gt;&lt;P&gt;What works:&lt;/P&gt;&lt;P&gt;databricks apps run-local starts the app and proxy successfully&lt;BR /&gt;Databricks CLI authentication works (databricks-cli auth type)&lt;BR /&gt;SQL Warehouse / Unity Catalog endpoints work perfectly locally&lt;BR /&gt;Lakebase SDK calls succeed: generate_database_credential(), get_database_instance(), and current_user.me() all return valid responses&lt;BR /&gt;The Lakebase PostgreSQL DNS resolves and port 5432 is reachable&lt;/P&gt;&lt;P&gt;As far as I can tell, the Databricks documentation never states that Lakebase Postgres is supported for local development. Does this mean I can't run my FastAPI endpoints that use Lakebase Postgres tables locally? Is the only way to test them to deploy the app on Databricks?&lt;BR /&gt;&lt;BR /&gt;Any feedback is appreciated.&lt;/P&gt;</description>
      <pubDate>Tue, 03 Mar 2026 21:46:46 GMT</pubDate>
      <guid>https://community.databricks.com/t5/lakebase-discussions/databricks-app-how-to-setup-lakebase-postgres-connection-locally/m-p/149736#M25</guid>
      <dc:creator>ctgchris</dc:creator>
      <dc:date>2026-03-03T21:46:46Z</dc:date>
    </item>
    <item>
      <title>Re: Databricks App how to set up lakebase postgres connection locally</title>
      <link>https://community.databricks.com/t5/lakebase-discussions/databricks-app-how-to-setup-lakebase-postgres-connection-locally/m-p/149740#M26</link>
      <description>&lt;P&gt;&lt;STRONG&gt;Summary:&lt;/STRONG&gt; You should be able to test your FastAPI endpoints locally. Lakebase supports direct external connections over the standard PostgreSQL wire protocol, so your local SQLAlchemy setup can query the Lakebase instance directly without being deployed to Databricks first.&lt;BR /&gt;&lt;STRONG&gt;Direct connectivity:&lt;/STRONG&gt; Lakebase accepts direct connections from local terminals, external tools (such as DBeaver or pgAdmin), and local development environments using standard PostgreSQL tooling such as the psycopg2 driver or SQLAlchemy.&lt;BR /&gt;&lt;STRONG&gt;Authentication:&lt;/STRONG&gt; When deployed, the Databricks App automatically exposes PGHOST and PGUSER and provisions a role tied to the app's service principal. For local development, authenticate your local SQLAlchemy connection with either a Databricks OAuth token (which you can generate via the CLI/SDK) or native PostgreSQL username and password credentials.&lt;BR /&gt;&lt;STRONG&gt;Network reachability:&lt;/STRONG&gt; Since you have already verified that the Lakebase DNS resolves and port 5432 is reachable, you are not being blocked by workspace IP ACLs or other network security settings.&lt;BR /&gt;&lt;STRONG&gt;Next steps:&lt;/STRONG&gt; Make sure your local environment variables mimic what the Databricks App would inject (specifically the host and user information), and pass your generated OAuth token or database password directly into your async SQLAlchemy connection string.&lt;/P&gt;
&lt;P&gt;This blog entry has more detailed instructions on how to set up SQLAlchemy:&amp;nbsp;&lt;A href="https://www.databricks.com/blog/how-use-lakebase-transactional-data-layer-databricks-apps" target="_blank"&gt;https://www.databricks.com/blog/how-use-lakebase-transactional-data-layer-databricks-apps&lt;/A&gt;&lt;/P&gt;</description>
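The "next steps" above can be sketched in Python. This is a minimal local-development sketch, not an official Databricks recipe: the instance name, host, and database name are placeholders you would replace with your own Lakebase instance's values (e.g. the PGHOST/PGUSER the deployed app sees). It assumes `generate_database_credential()` returns an object with a `token` attribute (as the question's working SDK calls suggest) and that your SQLAlchemy/asyncpg versions accept `ssl=require` as a URL query parameter; if not, pass `connect_args={"ssl": "require"}` to `create_async_engine` instead.

```python
# Placeholder names throughout: the instance name, host, and database name
# below are illustrative, not official defaults.
from urllib.parse import quote_plus


def build_lakebase_url(host: str, user: str, token: str,
                       database: str = "databricks_postgres") -> str:
    """Build an async SQLAlchemy URL for a Lakebase Postgres instance.

    The OAuth token is used as the password, so it must be URL-encoded;
    Lakebase requires SSL, requested here via the query string.
    """
    return (
        f"postgresql+asyncpg://{quote_plus(user)}:{quote_plus(token)}"
        f"@{host}:5432/{database}?ssl=require"
    )


if __name__ == "__main__":
    # Mint a short-lived OAuth token with the Databricks SDK, reusing the
    # local CLI auth that the question confirms already works.
    import uuid
    from databricks.sdk import WorkspaceClient
    from sqlalchemy.ext.asyncio import create_async_engine

    w = WorkspaceClient()
    cred = w.database.generate_database_credential(
        request_id=str(uuid.uuid4()),
        instance_names=["my-lakebase-instance"],  # placeholder instance name
    )
    url = build_lakebase_url(
        host="instance-xxxx.database.cloud.databricks.com",  # your PGHOST
        user=w.current_user.me().user_name,                  # your PGUSER
        token=cred.token,
    )
    engine = create_async_engine(url, pool_pre_ping=True)
```

Note that the OAuth token is short-lived, so a long-running local server would need to re-mint it periodically (for example in a `do_connect` event hook) rather than baking one token into the URL at startup.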
      <pubDate>Tue, 03 Mar 2026 23:18:16 GMT</pubDate>
      <guid>https://community.databricks.com/t5/lakebase-discussions/databricks-app-how-to-setup-lakebase-postgres-connection-locally/m-p/149740#M26</guid>
      <dc:creator>Lu_Wang_ENB_DBX</dc:creator>
      <dc:date>2026-03-03T23:18:16Z</dc:date>
    </item>
  </channel>
</rss>

