How to create a Lakebase table?
12-01-2025 09:15 PM
Hi Databricks community,
I want to create a Lakebase table that is synced with a Delta table, so that whenever the Delta table is updated the changes become available in the Lakebase table. I then want to build a Databricks Streamlit app that fetches data from the Lakebase table synced to the Delta table in Unity Catalog. Is this possible? How should I proceed with the implementation?
12-01-2025 11:47 PM
Hi @Gokkul007 ,
Yes, it's possible. Below you can find the docs for Lakebase. First, read the getting started section:
OLTP databases - Azure Databricks | Microsoft Learn
Next, read the section that explains how to sync your existing Delta tables to Lakebase.
Lastly, there are plenty of resources that show how to integrate Databricks Apps with Lakebase:
How to use Lakebase as a transactional data layer for Databricks Apps | Databricks Blog
How to Build Databricks Apps with Lakebase in Minutes
Add a Lakebase resource to a Databricks app | Databricks on AWS
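To make the app-integration step concrete, here is a minimal sketch of a Streamlit app reading from a Lakebase synced table over its Postgres-compatible endpoint. Everything concrete in it is an assumption, not taken from the docs above: the instance hostname, database name, secret keys, and the table name `orders_synced` are placeholders you would replace with your own values.

```python
# Hedged sketch: Streamlit app reading a Lakebase synced table via Postgres.
# Hostname, database name, secrets keys, and table name are all placeholders.

def build_dsn(host: str, dbname: str, user: str, password: str,
              port: int = 5432) -> str:
    """Build a libpq-style DSN; Lakebase exposes a Postgres-compatible endpoint."""
    return (f"host={host} port={port} dbname={dbname} "
            f"user={user} password={password} sslmode=require")


def fetch_rows(dsn: str, table: str, limit: int = 100):
    """Read the latest rows from the synced table over the Postgres protocol."""
    import psycopg2  # third-party: pip install psycopg2-binary
    with psycopg2.connect(dsn) as conn:
        with conn.cursor() as cur:
            # Table name is assumed trusted (not user input); only the limit
            # is passed as a bound parameter.
            cur.execute(f"SELECT * FROM {table} LIMIT %s", (limit,))
            return cur.fetchall()


def main():
    import streamlit as st  # imported lazily so the helpers above stay importable
    dsn = build_dsn(
        host="instance-xxxx.database.cloud.databricks.com",  # placeholder
        dbname="databricks_postgres",                        # placeholder
        user=st.secrets["PGUSER"],
        password=st.secrets["PGPASSWORD"],
    )
    st.title("Synced table viewer")
    st.dataframe(fetch_rows(dsn, "orders_synced"))

# Run with: streamlit run app.py  (and call main() at the bottom of the script)
```

Because the synced table is a regular Postgres table, any Postgres client works here; the docs linked above cover wiring the credentials in as a Databricks app resource instead of raw secrets.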
12-05-2025 10:06 AM
Yes, it’s possible to have a Lakebase table synced with a Delta table in Unity Catalog. You have a few options:
Direct read: Register the Delta table in Unity Catalog and query it directly from your Streamlit app.
Delta Live Tables (DLT): Create a DLT pipeline that reads the Delta table and writes a managed table that automatically updates on changes.
Materialized view: Create a SQL materialized view on the Delta table and refresh it periodically.
For production with transformations, use a DLT pipeline; for simple dashboards, a direct Delta read works. Streamlit can connect via the Databricks SQL Connector or JDBC/ODBC to fetch the latest data.
This ensures your app always sees the updated data in near real-time.
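For the direct-read option, a short sketch using the Databricks SQL Connector for Python looks roughly like this. The hostname, HTTP path, token, and table names are placeholders, and `qualified_name` is just a local helper, not part of the connector API:

```python
# Hedged sketch of the "direct read" option: querying a Unity Catalog Delta
# table with the Databricks SQL Connector. All connection details below are
# placeholders to be replaced with your workspace's values.

def qualified_name(catalog: str, schema: str, table: str) -> str:
    """Build the three-level Unity Catalog name used in queries."""
    return f"{catalog}.{schema}.{table}"


def read_delta_table(server_hostname: str, http_path: str, access_token: str,
                     table: str, limit: int = 100):
    """Run a SELECT against a SQL warehouse and return the rows."""
    from databricks import sql  # third-party: pip install databricks-sql-connector
    with sql.connect(server_hostname=server_hostname,
                     http_path=http_path,
                     access_token=access_token) as conn:
        with conn.cursor() as cur:
            # Table name assumed trusted; limit coerced to int to stay safe.
            cur.execute(f"SELECT * FROM {table} LIMIT {int(limit)}")
            return cur.fetchall()

# Example (placeholders):
# rows = read_delta_table("adb-123.azuredatabricks.net",
#                         "/sql/1.0/warehouses/abc123",
#                         "<token>",
#                         qualified_name("main", "default", "sales"))
```

Note this path queries the Delta table through a SQL warehouse on each request, so latency depends on the warehouse; the Lakebase synced-table approach from the earlier reply serves the same data at OLTP latencies.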
2 weeks ago
You can use Lakebase autoscaling with synced tables.
Synced tables let you export high-quality, governed data from the Lakehouse into Lakebase. They are specifically designed for downstream applications that require extreme responsiveness (sub-10 ms responses) along with the reliability of ACID-compliant transactions for real-time use cases.
For example, you can sync a gold table such as patient_details into a new synced table patient_details_synced. Choose a sync mode based on your needs:
- Snapshot - One-time copy of all data
- Triggered - Scheduled updates that run on demand or at intervals
- Continuous - Real-time streaming with latency in seconds
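As an illustrative sketch only, the choice between these three modes can be expressed as a function of how stale the Lakebase copy is allowed to be. The one-minute threshold below is my own assumption for the example, not a Databricks-documented cutoff:

```python
# Illustrative sketch: mapping a freshness requirement onto the three sync
# modes listed above. The 60-second threshold is an assumed example value.
from typing import Optional


def choose_sync_mode(max_staleness_seconds: Optional[float]) -> str:
    """Pick a sync mode given how stale the synced copy may be (None = one-off)."""
    if max_staleness_seconds is None:
        return "SNAPSHOT"    # one-time copy of all data
    if max_staleness_seconds <= 60:
        return "CONTINUOUS"  # real-time streaming, seconds of latency
    return "TRIGGERED"       # scheduled or on-demand refreshes
```

For the original question (a Streamlit app that should always see the latest Delta data), a continuous or frequently triggered sync is the natural fit.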
You can check out the resource Modern Databricks Lakebase with Medical Inventory Management Apps here.