Lakebase Articles
A structured knowledge hub for Lakebase. Find in-depth technical content, how-to guides, and referen...
Discover curated blogs from the community and experts. Learn through practical guides, use cases, an...
Ask questions, share challenges, and connect with others working on Lakebase. From troubleshooting t...
The Problem Nobody Likes to Admit. Imagine this scenario: your data team has built a flawless lakehouse. Ingest pipelines, bronze/silver/gold tiers, gleaming dashboards. Everything is working perfectly. Until someone asks: "And the production app? Wher...
Hi DB Community, is there any way to access or write to a Delta table in UC from Lakebase Postgres? There is a way using "Sync Table", but it is recommended only for reading data from a Sync Table; Databricks recommends against writing to sync tables. Or else...
Currently, direct write-back from Lakebase PostgreSQL to Unity Catalog Delta tables is not the recommended pattern. A few points that may help:
• “Sync Tables” are mainly designed for replication/read-access scenarios. Databricks currently recommends a...
"The data platform we’ve built with Databricks gives us a treasure trove of usable, enriched data that sets us apart from anyone else in the industry. It’s the foundation for solving problems no one else can." - Grant Veazey, CTO, Ensemble. Ensemb...
Hello Data Professionals, I need to tell you about something. I have been working with data platforms for a long time. Long enough to remember when "big data" was the buzzword, when Hadoop was the answer to everything, when data lakes were going to re...
Healthcare organizations have invested heavily in modern Lakehouse architectures for enterprise data analytics, AI, and governance over the last decade. Care systems, claims platforms, imaging applications, pharmacy systems, device telemetry, and pat...
Hi Databricks community, I want to create a Lakebase table that is synced with a Delta table, so whenever the Delta table is updated, the changes become available in the Lakebase table. Now I want to create a Databricks Streamlit application and ma...
You can use Lakebase Autoscaling with Synced Tables. Synced tables let you export high-quality, governed data from the Lakehouse into Lakebase. They are specifically designed for downstream applications requiring extreme responsiveness (sub-10ms resp...
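The reply above describes serving governed Lakehouse data to an app through a synced table. As a minimal sketch of the app-side read path: synced tables are read-only from the Postgres side, so the application only issues SELECTs. The table and column names (`orders_synced`, `customer_id`, etc.) are placeholders, not from the post, and `sqlite3` stands in here for a real Lakebase Postgres connection (with psycopg you would use `%s` placeholders instead of `?`).

```python
import sqlite3

def fetch_latest_orders(conn, customer_id, limit=10):
    """Read-only query against a hypothetical synced table."""
    cur = conn.cursor()
    cur.execute(
        "SELECT order_id, status FROM orders_synced "
        "WHERE customer_id = ? ORDER BY order_id DESC LIMIT ?",
        (customer_id, limit),
    )
    return cur.fetchall()

# Demo against an in-memory stand-in for the synced table:
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders_synced (order_id INT, customer_id INT, status TEXT)"
)
conn.executemany(
    "INSERT INTO orders_synced VALUES (?, ?, ?)",
    [(1, 42, "shipped"), (2, 42, "pending"), (3, 7, "shipped")],
)
print(fetch_latest_orders(conn, 42))  # [(2, 'pending'), (1, 'shipped')]
```

Because the sync pipeline owns all writes, the app never needs INSERT/UPDATE privileges on the synced table, which keeps the read path simple and safe.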
It doesn’t appear that Lakebase OLTP instances function like SQL warehouses in the following ways:
• automatically starting when initiating a connection
• automatically stopping after no usage in x minutes
I am wondering if others have use cases for...
Lakebase supports Autoscaling and Scale to Zero, with Automatic Suspension and Reactivation capabilities. Automatic Suspension - Lakebase compute automatically suspends after a period of inactivity (the default is 5 minutes). Reactivation - The compute automat...
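One practical consequence of automatic suspension is that the first connection after a quiet period may hit the instance mid-wake-up. A short retry loop with backoff rides that out; the sketch below is a generic wrapper, and the attempt count and delay values are illustrative assumptions, not values from the docs (with psycopg you would typically catch `OperationalError` rather than bare `Exception`).

```python
import time

def connect_with_retry(connect, attempts=4, base_delay=0.5):
    """Call connect() until it succeeds, backing off exponentially.

    Intended for the cold start after scale-to-zero suspension: the
    first attempt after reactivation can fail, so retry with delays
    of base_delay, 2*base_delay, 4*base_delay, ...
    """
    last_exc = None
    for attempt in range(attempts):
        try:
            return connect()
        except Exception as exc:  # e.g. connection refused during wake-up
            last_exc = exc
            if attempt < attempts - 1:
                time.sleep(base_delay * (2 ** attempt))
    raise last_exc
```

In an app, `connect` would be something like `lambda: psycopg.connect(dsn)`; keeping the wrapper transport-agnostic makes it easy to unit-test with a fake.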
Patient Journey in Healthcare is about continuity. The care story unfolds over years & decisions are shaped by history. Every assessment depends not just on current symptoms but everything that came before. However, most AI systems introduced into he...
Community Space for Lakebase Insights. Are you building with Lakebase? If so, do you have a single source to stay ahead of every technical shift and architectural breakthrough? More importantly, do you know what practitioners are actually saying o...
Hi @Tushar_Parekar, thank you for the mention. I truly appreciate it. I’m a big fan of Databricks, and I’ll continue to contribute to this community with my best efforts. Regards, Brahma
In many data projects, analytics and operational systems still live in separate worlds. One platform is used for reporting, dashboards, and AI. Another system is used for application data, transactions, and day-to-day business activity.That setup is ...
Hi all, I’m trying to set up REST-based communication between my Lakebase instance and a REST client. I’m following the documentation “Connecting to Lakebase via REST using a service principal” to obtain a workspace-level token. After that, I use the Lakebase D...
I created a new Lakebase project to retrace all my steps.
0. I reused my service principal on the workspace.
1. Installed the Databricks authentication extension: CREATE EXTENSION IF NOT EXISTS databricks_auth;
2. Added the lakehouse service principal to the...
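The steps above can be made repeatable by running the setup SQL from a small script. Only the CREATE EXTENSION statement comes from the post; the runner and the dry-run FakeCursor below are assumptions for illustration. In practice `cursor` would be a psycopg cursor on the Lakebase instance, and the list would also include the role-grant statement from step 2 (omitted here since the post is truncated).

```python
# Setup SQL taken from the post; later steps are elided there, so
# they are left out rather than guessed.
SETUP_SQL = [
    "CREATE EXTENSION IF NOT EXISTS databricks_auth;",
]

def run_setup(cursor, statements=SETUP_SQL):
    """Execute each setup statement on any DB-API-style cursor."""
    for stmt in statements:
        cursor.execute(stmt)
    return len(statements)

class FakeCursor:
    """Records statements instead of executing them (dry run)."""
    def __init__(self):
        self.executed = []
    def execute(self, stmt):
        self.executed.append(stmt)

cur = FakeCursor()
print(run_setup(cur))  # 1
```

Keeping the statements in a plain list makes it easy to diff what was run against a fresh project when retracing steps like this.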
Greetings, I am following the "Get Started with Lakebase" course. I got access to "Databricks Academy Labs", but I could not find documentation related to the "Get Started with Lakebase" course. I appreciate any help. Thanks, Sri
@Sumit_7, thank you for sharing the course URL. I have been working through the same course. Please refer to the 4th lesson, “Demo: Creating and Exploring a Lakebase Project.” I was not able to find any related documentation, specifically the tables d...
Lakebase Postgres now supports customer‑managed keys (CMK), so security teams can keep encryption keys in their own cloud KMS (AWS KMS, Azure Key Vault, or Google Cloud KMS) while Databricks runs Lakebase as a managed service. Key highlights: Your key...
I can't view my lakebase postgres autoscaling project's table tab.The error id is: 3760588e08b843c5a4aebac770f8e967
Hi @ctgchris,I'm experiencing the same issue — unable to view the table tab on Lakebase Postgres Autoscaling. Has there been any update or resolution on this? Any workaround would be appreciated as well.Thanks!
Hi everyone, I recently started exploring Lakebase on Databricks, and I wanted to share a few thoughts from my early experience. What I like most so far is how it simplifies things. Instead of dealing with multiple tools for storage, processing, and AI...
@Brahmareddy Good to hear, definitely going to try now.