- 718 Views
- 2 replies
- 1 kudos
How to create a Lakebase table?
Hi Databricks community, I want to create a Lakebase table that is synced with a Delta table, so that whenever the Delta table is updated the changes are available in the Lakebase table. Now I want to create a Databricks Streamlit application and ma...
Yes, it’s possible to have a Lakebase table synced with a Delta table in Unity Catalog. You have a few options:
- Direct read: Register the Delta table in Unity Catalog and query it directly from your Streamlit app.
- Delta Live Tables (DLT): Create a DLT...
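Since the question is about reading the synced table from a Streamlit app, a minimal sketch of the read path may help. Everything here is an assumption rather than the documented API: the host, database, and table names are placeholders, and the connection goes through plain `psycopg2` against the instance's Postgres endpoint.

```python
# Sketch: querying a Lakebase (Postgres) table from a Streamlit app.
# Host/database/table names are placeholders -- substitute the connection
# details shown for your instance in the Databricks UI.

def lakebase_dsn(host: str, dbname: str, user: str, password: str,
                 port: int = 5432) -> str:
    """Build a libpq-style DSN; Lakebase endpoints require SSL."""
    return (f"host={host} port={port} dbname={dbname} "
            f"user={user} password={password} sslmode=require")

def load_rows(dsn: str, table: str) -> list:
    # psycopg2 is imported locally so the DSN helper stays importable
    # even where the driver isn't installed.
    import psycopg2
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        cur.execute(f"SELECT * FROM {table} LIMIT 100")
        return cur.fetchall()

# In the Streamlit app itself you might then render the rows with:
#   import streamlit as st
#   st.dataframe(load_rows(dsn, "my_synced_table"))
```

Because the synced table is kept up to date on the Lakebase side, the app only ever issues plain Postgres reads; no Delta-specific client is needed.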
- 582 Views
- 1 replies
- 2 kudos
Resolved! Lakebase query history / details
Is there somewhere in Databricks that I can see details about queries run against one of my Lakebase databases (similar to the query history system tables)? What I'm ultimately trying to figure out is where the time is being spent between when I issue the ...
Hi @pdiamond,
Currently in beta there's a feature that lets you monitor active queries: https://docs.databricks.com/aws/en/oltp/projects/active-queries
Also in beta there's a Lakebase SQL editor that will allow you to analyze queries: https://docs.databr...
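While those beta features mature, a hedged workaround is to ask Postgres itself: `pg_stat_activity` is a standard Postgres view, so it should also be queryable on a Lakebase endpoint (the connection DSN below is a placeholder). A sketch:

```python
# Sketch: listing in-flight queries via the standard Postgres
# pg_stat_activity view. Connection details are placeholders.

ACTIVE_QUERIES_SQL = """
SELECT pid,
       now() - query_start AS runtime,
       state,
       query
FROM pg_stat_activity
WHERE state <> 'idle'
ORDER BY runtime DESC;
"""

def active_queries(dsn: str) -> list:
    # Local import: keeps the SQL constant usable without the driver.
    import psycopg2
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        cur.execute(ACTIVE_QUERIES_SQL)
        return cur.fetchall()
```

For the "where is the time going" question, comparing `now() - query_start` server-side against client-side wall-clock time can help separate query execution from connection/network overhead.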
- 4277 Views
- 2 replies
- 1 kudos
Resolved! Syncing lakebase table to delta table
I have been exploring Lakebase and I wanted to know if there is a way to sync CDC data from Lakebase tables to Delta tables in the Lakehouse. I know the other way is possible and that's what was shown in the demo. Can you tell me how I can sync both the ta...
Just wanted to mention that the ETL from Lakebase to Delta Tables preview is mentioned here: https://www.databricks.com/blog/how-use-lakebase-transactional-data-layer-databricks-apps
- 861 Views
- 1 replies
- 1 kudos
Resolved! Lakebase / Feature Store error: “Failed to get identity details for username” (service principal)
Hello, I'm running into a Lakebase / Feature Store issue related to service principal authentication when trying to log or read from the Databricks Feature Store. I am migrating from the legacy online tables. Here's the exact error: psycopg2.OperationalErr...
The error you're encountering, psycopg2.OperationalError: FATAL: Failed to get identity details for username: "user_uuid", typically arises from an OAuth identity mismatch or invalid token scope when a Databricks service principal is used to authent...
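For context, the usual pattern for service-principal connections is to mint a short-lived OAuth token via the SDK and use it as the Postgres password, with the SP's client ID as the username. The `generate_database_credential` call below is an assumption about the current databricks-sdk surface (check your installed version), and the instance name is a placeholder.

```python
# Sketch: connecting to Lakebase as a service principal.
# Assumptions: w.database.generate_database_credential exists in your
# databricks-sdk version; "my-instance" is a placeholder instance name.
import uuid

def conn_kwargs(host: str, dbname: str, sp_client_id: str, token: str) -> dict:
    """Postgres connection kwargs: SP client ID as user, OAuth token as
    password. Tokens expire, so mint one per connection."""
    return dict(host=host, dbname=dbname, user=sp_client_id,
                password=token, sslmode="require")

def fetch_token(instance_name: str) -> str:
    from databricks.sdk import WorkspaceClient
    w = WorkspaceClient()  # authenticates as the service principal
    cred = w.database.generate_database_credential(
        request_id=str(uuid.uuid4()),
        instance_names=[instance_name],
    )
    return cred.token
```

A common cause of the "Failed to get identity details" error is connecting with a username (e.g. a UUID) that was never added as a Postgres role on the instance, so it is worth verifying the SP has been granted access to the database before debugging the token itself.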
- 1582 Views
- 3 replies
- 4 kudos
Lakebase security
Hi team, We are using Databricks Enterprise and noticed that our Lakebase instances are exposed to the public internet. They can be reached through the JDBC endpoint with only basic username and password authentication. Is there a way to restrict acce...
The Postgres instance is covered by the Private Link you configure for your workspace.
- 1875 Views
- 1 replies
- 3 kudos
Resolved! Lakebase storage location
Hi, I'm a Solution Architect from a reputed insurance company looking for a few key pieces of technical information about the Lakebase architecture. Since it is a fully managed serverless OLTP offering from Databricks, there is no clear documentation that talks about data st...
Hi @YugandharG,
1. Lakebase data is stored in Databricks-managed cloud object storage. There's no option to use customer storage as of now.
2. File format: vanilla Postgres pages. The storage format of Postgres has nothing to do with Parquet/Delta. Wa...
- 2019 Views
- 3 replies
- 2 kudos
Resolved! Lakebase Scale-to-Zero Behavior: Automatic or Application-Controlled?
Hi all, Lakebase is currently advertised as a database system that can scale down to zero: https://www.databricks.com/blog/what-is-a-lakebase Does anyone know if this scale-to-zero behavior is handled automatically by Databricks when the database is idl...
Hey @mrp,
Some features of Lakebase are still in public preview, so not all functionality is available yet. @ilorus is correct that "scale to zero" is not currently part of the product. However, it is on the roadmap and should be available early nex...
- 1571 Views
- 5 replies
- 4 kudos
Resolved! Start and stop lakebase instance
Hello there! I have been using the databricks-sdk for a while, and I have managed to build a system where I control when clusters and applications start and stop. However, now that we have adopted the new Lakebase feature, we were wonderin...
Sure, @juanjomendez96. You just need to use the following code from the Python SDK:

```python
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.database import DatabaseInstance

# Initialize the Workspace client
w = WorkspaceClient()

# Stop a database...
```
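The reply above is cut off, so here is one possible shape of the stop/start call, as a sketch only: the idea is to toggle a `stopped` field through an update call. The exact method and field names are assumptions about the databricks-sdk Database API and should be verified against your installed SDK version.

```python
# Sketch: stopping/starting a Lakebase instance by toggling a `stopped`
# flag via an update call. Method/field names are assumptions about the
# databricks-sdk Database API -- verify against your SDK version.

def stop_request(instance_name: str, stopped: bool) -> dict:
    """Pure helper: the fields we intend to send in the update call."""
    return {"name": instance_name, "stopped": stopped,
            "update_mask": "stopped"}

def set_stopped(instance_name: str, stopped: bool) -> None:
    from databricks.sdk import WorkspaceClient
    from databricks.sdk.service.database import DatabaseInstance
    req = stop_request(instance_name, stopped)
    w = WorkspaceClient()
    w.database.update_database_instance(
        name=req["name"],
        database_instance=DatabaseInstance(name=req["name"],
                                           stopped=req["stopped"]),
        update_mask=req["update_mask"],
    )

# set_stopped("my-instance", True)   # stop
# set_stopped("my-instance", False)  # start
```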
- 2149 Views
- 2 replies
- 0 kudos
Lakebase auto start/stop
It doesn't appear that the Lakebase OLTP instances function like SQL warehouses in the following ways:
- automatically starting when initiating a connection
- automatically stopping after no usage in x minutes
I am wondering if others have use cases for...
I guess the start/stop pattern is not something you want in a low-latency OLTP database. Perhaps they will add it in the future.
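Until the platform offers it, auto-stop-after-idle can be approximated on the application side. A minimal sketch, assuming only that you have some callable that stops the instance (for example, an SDK call): every activity resets a timer, and when the timer fires the stop callback runs.

```python
# Sketch: application-side "auto stop after N idle seconds".
# `on_idle` is any zero-argument callable that stops the instance;
# nothing here is Lakebase-specific.
import threading

class IdleStopper:
    def __init__(self, idle_seconds: float, on_idle) -> None:
        self._idle_seconds = idle_seconds
        self._on_idle = on_idle
        self._timer = None

    def touch(self) -> None:
        """Call on every query/connection to push the idle deadline back."""
        if self._timer is not None:
            self._timer.cancel()
        self._timer = threading.Timer(self._idle_seconds, self._on_idle)
        self._timer.daemon = True
        self._timer.start()

    def cancel(self) -> None:
        """Disable the idle watchdog (e.g. on shutdown)."""
        if self._timer is not None:
            self._timer.cancel()
```

Auto-start is harder to fake client-side, since the first connection after a stop has to wait for the instance to come up, which is exactly the cold-start latency the reply above argues against for OLTP.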
- 3968 Views
- 1 replies
- 2 kudos
Resolved! Lakebase use cases
1. What are the use cases for Lakebase? When should I use Lakebase Postgres over Delta tables?
2. What are the differences between open-source Postgres and Lakebase?
3. Should I utilize Lakebase for all OLTP requirements?
Hi @Sharanya13,
1. Use Lakebase whenever you have an application (OLTP) workload and you require low latency. For analytical workloads, use the Lakehouse. Here are a couple of example use cases from the documentation: Serving data and/or features from the lake...