Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

Sakthi0311
by New Contributor
  • 174 Views
  • 2 replies
  • 1 kudos

How to enable Liquid Clustering on an existing Delta Live Table (DLT) and syntax for enabling it

Hi all, I'm working with Delta Live Tables (DLT) and want to enable Liquid Clustering on an existing DLT table that was already created without it. Could someone please clarify: How can I enable Liquid Clustering on an existing DLT table (without recre...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 1 kudos

Hi @Sakthi0311, in SQL you can enable liquid clustering on materialized views and streaming tables; the syntax looks as follows. If you want automatic clustering, use CLUSTER BY AUTO instead.
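
In SQL pipeline code this is the CLUSTER BY (...) clause, or CLUSTER BY AUTO, on the CREATE OR REFRESH statement. A minimal Python-API sketch of the same idea, assuming hypothetical table and column names; cluster_by is the Python counterpart of that clause:

import dlt

# Hypothetical existing DLT/Lakeflow table; adding cluster_by to the definition
# enables liquid clustering on the next pipeline update. Newly written data is
# clustered by these keys; existing files are reclustered by later OPTIMIZE or
# maintenance runs.
@dlt.table(
    name="orders_silver",                      # assumption: your existing table name
    cluster_by=["order_date", "customer_id"],  # assumption: clustering key columns
)
def orders_silver():
    return spark.readStream.table("orders_bronze")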

1 More Replies
excavator-matt
by Contributor
  • 645 Views
  • 6 replies
  • 3 kudos

Resolved! How do I use Databricks Lakeflow Declarative Pipelines on AWS DMS data?

Hi! I am trying to replicate an AWS RDS PostgreSQL database in Databricks. I have successfully managed to enable CDC using AWS DMS, which writes an initial load file and continuous CDC files in Parquet. I have been trying to follow the official guide Repl...

Data Engineering
AUTO CDC
AWS DMS
declarative pipelines
LakeFlow
Latest Reply
mmayorga
Databricks Employee
  • 3 kudos

Hey @excavator-matt, let's remember that the Bronze layer is for raw ingestion only; it provides a baseline for auditing and a starting point for applying transformations based on the different use cases you need to serve. Systems and their requirements change...
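
For the CDC part itself, a minimal sketch of one common pattern, assuming hypothetical paths and column names and that the DMS Parquet files carry the usual Op (I/U/D) flag plus a sequencing column; dlt.apply_changes is the Python API behind AUTO CDC / APPLY CHANGES:

import dlt
from pyspark.sql.functions import expr

# Bronze: raw DMS output (initial load + CDC files) ingested as-is with Auto Loader.
@dlt.view(name="customers_cdc_raw")
def customers_cdc_raw():
    return (spark.readStream.format("cloudFiles")
            .option("cloudFiles.format", "parquet")
            .load("s3://my-bucket/dms/public/customers/"))   # assumption: DMS target path

# Silver: replay inserts, updates and deletes in order onto a streaming table.
dlt.create_streaming_table("customers")

dlt.apply_changes(
    target="customers",
    source="customers_cdc_raw",
    keys=["customer_id"],                        # assumption: primary key column
    sequence_by="transact_seq",                  # assumption: DMS sequencing column
    apply_as_deletes=expr("Op = 'D'"),           # DMS marks deletes with Op = 'D'
    except_column_list=["Op", "transact_seq"],   # keep CDC metadata out of the target
    stored_as_scd_type=1,
)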

5 More Replies
Adam_Borlase
by New Contributor III
  • 338 Views
  • 4 replies
  • 5 kudos

Resolved! Quota Limit Exhausted Error when Creating Data Ingestion with SQL Server Connector (Azure)

Good day all, I am having an issue with our first data ingestion pipelines. I want to connect to our Azure SQL Server with our Unity connector (I can access the data in Unity Catalog). When I am on Step 3 of the process (Source), when it is scann...

Latest Reply
Adam_Borlase
New Contributor III
  • 5 kudos

Thank you for all of your assistance!

3 More Replies
ghofigjong
by New Contributor
  • 11295 Views
  • 5 replies
  • 3 kudos

Resolved! How does partition pruning work on a merge into statement?

I have a Delta table that is partitioned by Year, Date, and Month. I'm trying to merge data into it on all three partition columns plus an extra column (an ID). My merge statement is below: MERGE INTO delta.<path of delta table> oldData using df newData ...

Latest Reply
Umesh_S
New Contributor II
  • 3 kudos

Isn't the suggested idea only filtering the input DataFrame (resulting in a smaller amount of data to match across the whole Delta table), rather than pruning the Delta table down to the relevant partitions to scan?
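
One commonly recommended way to make the target prune as well is to put literal predicates on the target's partition columns directly in the ON clause, derived from the incoming batch. A minimal sketch with hypothetical names and path:

# Collect the partition values present in this batch and inline them as
# literals, so the MERGE only scans matching partitions of the target.
years  = ",".join(str(r.Year)  for r in df.select("Year").distinct().collect())
months = ",".join(str(r.Month) for r in df.select("Month").distinct().collect())

df.createOrReplaceTempView("newData")

spark.sql(f"""
    MERGE INTO delta.`/path/of/delta/table` oldData   -- hypothetical path
    USING newData
    ON  oldData.Year  IN ({years})
    AND oldData.Month IN ({months})
    AND oldData.Year  = newData.Year
    AND oldData.Month = newData.Month
    AND oldData.Date  = newData.Date
    AND oldData.Id    = newData.Id
    WHEN MATCHED THEN UPDATE SET *
    WHEN NOT MATCHED THEN INSERT *
""")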

4 More Replies
yit
by Contributor III
  • 157 Views
  • 3 replies
  • 3 kudos

Resolved! Does Auto Loader support loading PDF files?

I need to process PDF files that are already ingested. Based on the documentation, Auto Loader does not support PDFs - or am I missing something? Also, I've found this sparkPDF library in other discussions in the community, but from what I see it's only for bat...

Latest Reply
yit
Contributor III
  • 3 kudos

Any suggestions on how to handle PDFs? @szymon_dybczak
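
One pattern that works today, sketched with hypothetical paths: Auto Loader will not parse PDFs, but it can ingest them incrementally as binary files, after which a Python PDF library (for example pypdf) can extract the text in a UDF or a downstream batch step:

# Ingest PDFs as raw bytes with Auto Loader's binaryFile format.
raw_pdfs = (
    spark.readStream
    .format("cloudFiles")
    .option("cloudFiles.format", "binaryFile")
    .load("/Volumes/main/default/pdf_landing/")   # hypothetical volume path
)

# Each row carries: path, modificationTime, length, content (the PDF bytes).
(raw_pdfs.writeStream
    .option("checkpointLocation", "/Volumes/main/default/checkpoints/pdf_bronze")  # hypothetical
    .trigger(availableNow=True)
    .toTable("main.default.pdf_bronze"))          # hypothetical target table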

2 More Replies
Filip
by New Contributor II
  • 6912 Views
  • 7 replies
  • 0 kudos

How to assign a user-assigned managed identity to a DBR cluster so I can use it for querying ADLS Gen2?

Hi, I'm trying to figure out if we can switch from Entra ID SPNs to user-assigned managed identities, and everything works except that I can't figure out how to access the lake files from a Python notebook. I've tried the code below, running it on a ...

Latest Reply
Coffee77
Contributor
  • 0 kudos

Besides, this only works on dedicated clusters, not on shared ones. Why? No idea at all. In the latest case, IMDS (the Instance Metadata Service) is used by Azure to inject the token endpoint inside resources as a single secure and valid channel to get tokens ...
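
For completeness, the ABFS driver does ship a managed-identity token provider that can be wired up from a notebook. A minimal sketch with placeholder values; as noted above, it depends on IMDS, so expect it to work on dedicated (single-user) clusters only:

storage_account = "<storage-account>"             # placeholder
tenant_id       = "<tenant-id>"                   # placeholder
mi_client_id    = "<user-assigned-mi-client-id>"  # placeholder

suffix = f"{storage_account}.dfs.core.windows.net"

spark.conf.set(f"fs.azure.account.auth.type.{suffix}", "OAuth")
spark.conf.set(f"fs.azure.account.oauth.provider.type.{suffix}",
               "org.apache.hadoop.fs.azurebfs.oauth2.MsiTokenProvider")
spark.conf.set(f"fs.azure.account.oauth2.msi.tenant.{suffix}", tenant_id)
spark.conf.set(f"fs.azure.account.oauth2.client.id.{suffix}", mi_client_id)

# Quick read to verify access (container and path are placeholders).
display(spark.read.text(f"abfss://mycontainer@{suffix}/some/path/"))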

6 More Replies
Phani1
by Valued Contributor II
  • 2717 Views
  • 1 replies
  • 0 kudos

Integrating Genie with Streamlit

Hi all, what are the best practices to follow while integrating Genie with Streamlit, and are there any limitations? What is the best way to present it at the UI level from a user's perspective? Regards, Phani

Latest Reply
AbhaySingh
Databricks Employee
  • 0 kudos

This should help you get started. Please let us know if you have any specific questions after you've looked at the links below. https://blog.streamlit.io/best-practices-for-building-genai-apps-with-streamlit/ https://databrickster.medium.com/call-genie-...

AkhileshVB
by New Contributor
  • 2367 Views
  • 2 replies
  • 1 kudos

Resolved! Syncing Lakebase tables to Delta tables

I have been exploring Lakebase and I wanted to know if there is a way to sync CDC data from Lakebase tables to Delta tables in the Lakehouse. I know the other direction is possible, and that's what was shown in the demo. Can you tell me how I can sync both the ta...

Latest Reply
Malthe
Contributor II
  • 1 kudos

Just wanted to mention that the ETL from Lakebase to Delta tables preview is mentioned here: https://www.databricks.com/blog/how-use-lakebase-transactional-data-layer-databricks-apps

1 More Replies
julius_bkr
by New Contributor II
  • 192 Views
  • 3 replies
  • 3 kudos

Resolved! Hive Metastore End of Life

Hello everyone, is there a rough date on which the Hive metastore will be deactivated? In the end, I am asking the question again that was already asked two years ago: Solved: Hive metastore table access control End of Support - Databricks Community - 50487. We...

Latest Reply
Abhishek_Patel
New Contributor II
  • 3 kudos

Hi @julius_bkr, I do not think Databricks has any plans to retire the Hive metastore completely. However, Unity Catalog is the strategic direction and the recommended approach for new deployments and for migrating existing data governance. Databricks is investing heavil...

2 More Replies
ismaelhenzel
by Contributor
  • 4724 Views
  • 2 replies
  • 1 kudos

Delta Live Tables - foreign keys

I'm creating ingestions using Delta Live Tables; DLT supports the use of a schema with constraints like foreign keys. The problem is: how can I create foreign keys between tables in the same pipeline that have no read/write relation, but do have a foreign key rela...

Latest Reply
User12350
New Contributor II
  • 1 kudos

We ran into similar issues with our client while migrating their on-prem table relationships to Databricks via DLT/LDP. The first proposed solution does not work on materialized views (MVs): > 'ALTER TABLE ... ADD CONSTRAINT' expects a table but <objec...
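
For reference, the ALTER TABLE route from that first proposed solution looks like this on ordinary Unity Catalog tables (informational constraints, hypothetical names); run against a materialized view it fails with the error quoted above:

# Informational (NOT ENFORCED) constraints on regular Unity Catalog tables.
# The referenced column must be NOT NULL and carry a primary key first.
spark.sql("""
    ALTER TABLE main.sales.customers
    ADD CONSTRAINT pk_customers PRIMARY KEY (customer_id)
""")

spark.sql("""
    ALTER TABLE main.sales.orders
    ADD CONSTRAINT fk_orders_customers
    FOREIGN KEY (customer_id) REFERENCES main.sales.customers (customer_id)
""")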

1 More Replies
flashmav1
by New Contributor II
  • 301 Views
  • 6 replies
  • 3 kudos

Resolved! [NUMERIC_VALUE_OUT_OF_RANGE.WITHOUT_SUGGESTION] The -12874815911431.6200000000 rounded half up from

I am using Databricks Runtime 15.4 and getting the below error while reading from JDBC and writing to an AWS S3 location: [NUMERIC_VALUE_OUT_OF_RANGE.WITHOUT_SUGGESTION] The -12874815911431.6200000000 rounded half up from -12874815911431.6200000000 cannot b...
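
That error usually means the value has more integer digits than the DECIMAL(precision, scale) the column was mapped to. A sketch of two common workarounds, with a hypothetical column name and placeholder connection details:

from pyspark.sql.functions import col
from pyspark.sql.types import DecimalType

jdbc_url = "jdbc:postgresql://<host>:5432/<db>"   # placeholder connection string

# Option 1: override the inferred decimal type at read time. customSchema is a
# standard Spark JDBC reader option; "amount" is a hypothetical column name.
df = (spark.read.format("jdbc")
      .option("url", jdbc_url)
      .option("user", "<user>").option("password", "<password>")
      .option("dbtable", "my_schema.my_table")
      .option("customSchema", "amount DECIMAL(38, 10)")
      .load())

# Option 2: cast explicitly before writing. DECIMAL(38, 10) leaves 28 digits for
# the integer part, enough for -12874815911431.62 (14 integer digits).
df = df.withColumn("amount", col("amount").cast(DecimalType(38, 10)))

df.write.mode("overwrite").parquet("s3://my-bucket/target/")   # placeholder path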

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 3 kudos

 

5 More Replies
Lenoq
by New Contributor II
  • 241 Views
  • 4 replies
  • 2 kudos

Are there SQL linters for Databricks GUI queries and notebook %sql cells?

I'm looking for SQL linters in two different contexts within the Databricks GUI: (1) SQL queries in the Databricks SQL Editor (GUI) - is there a built-in linter for writing SQL queries in the Databricks SQL workspace? (2) %sql magic cells in Databricks notebooks (GUI...

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 2 kudos

In a Databricks notebook it is possible to connect your own linters, but I don't think that will work in the SQL editor.
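
A minimal sketch of that notebook-side approach, using sqlfluff (a third-party linter with a databricks dialect) against an ad-hoc SQL string; the query below is just an example:

# %pip install sqlfluff   (run in its own cell first)
import sqlfluff

query = """
SELECT a.id, b.name
FROM catalog.schema.orders a
join catalog.schema.customers b on a.customer_id = b.id
"""

# Returns a list of dicts describing rule violations, e.g. the inconsistent
# keyword capitalisation on the JOIN above.
for violation in sqlfluff.lint(query, dialect="databricks"):
    print(violation["code"], violation["description"])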

3 More Replies
thatsawinner
by New Contributor II
  • 344 Views
  • 4 replies
  • 1 kudos

Resolved! Notebook Session Has Crashed

I am getting a pop-up error message in the right-hand corner of my Databricks session: "Your notebook session has crashed." This is a notebook I've been working in for a while. The only line of code I am running is pip list. The error at the bottom of ...

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 1 kudos

From the screen, it looks like an interactive cluster—maybe try rebooting it.

3 More Replies
Lenoq
by New Contributor II
  • 152 Views
  • 1 replies
  • 1 kudos

Resolved! Looking for CLI-based SQL formatter for Databricks: Alternative to gethue/sql-formatter for .

I'm looking for a SQL formatter (CLI) for two different contexts within Databricks. 1. SQL queries in .sql files - What tool does Databricks use by default to format SQL? 2. %sql magic cells in notebooks (.ipynb files) - What tool does Databricks use b...

Latest Reply
ilir_nuredini
Honored Contributor
  • 1 kudos

Hi @Lenoq, as far as I know there is currently no tool from databricks/labs that meets your criteria. The closest I can see is the tool you gave as an example, but as you said it's not that good. What I would maybe do is an intermediate optio...

Prasanna_N
by New Contributor
  • 3275 Views
  • 2 replies
  • 2 kudos

Inference table Monitoring

I have data from March 1 to March 14 in the final inference table and I have set a 1-week granularity. After that, the profile and drift tables are generated and I see the window start time like this: start: "2025-02-24T00:00:00.000Z", end: "2025-03-03...

Latest Reply
AbhayPSingh
Databricks Employee
  • 2 kudos

More or less repeating what Mark said and adding some additional thoughts. Why the window starts from February 24: the reason you're seeing a window starting from February 24 (even though your data starts March 1) is that monitoring systems align t...
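
A quick way to see that alignment with the standard library only (the Monday-start weekly window is inferred from the boundaries shown in the question):

from datetime import date, timedelta

first_record = date(2025, 3, 1)
# Weekly windows align to calendar weeks rather than to the first record, so
# March 1 falls into the window that starts on the preceding Monday.
window_start = first_record - timedelta(days=first_record.weekday())
window_end   = window_start + timedelta(days=7)
print(window_start, window_end)   # 2025-02-24 2025-03-03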

1 More Replies
