Hi @WearBeard By default, streaming tables require append-only sources. The error occurs because an update or delete operation was performed on 'streaming_table_test'. To fix this, perform a Full Refresh on the 'streaming_table_test' table.
You ca...
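As a minimal sketch, a full refresh can be triggered from SQL (assuming the Databricks `REFRESH STREAMING TABLE ... FULL` syntax; a full refresh can also be started from the pipeline UI):

```python
# A full refresh discards current state and re-processes all source data.
# 'streaming_table_test' is the table from this thread; actually running the
# command requires a live Databricks session, so spark.sql is commented out.
refresh_sql = "REFRESH STREAMING TABLE streaming_table_test FULL"
# spark.sql(refresh_sql)  # uncomment on a Databricks SQL warehouse / DLT runtime
```

Note that a full refresh re-reads everything from the source, so it can be expensive on large tables.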
Hi @Fnazar
When dealing with streaming data, you can end up with many small files, which is inefficient to read. Use Delta Lake's OPTIMIZE command to compact them into larger files, and ZORDER to colocate related information in the same set of files. T...
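A minimal sketch of issuing the command from PySpark (the table and column names are hypothetical; pick ZORDER columns you actually filter on):

```python
# Compact small files and colocate rows by frequently filtered columns.
# Running the statement requires a live Spark session on a Delta-enabled cluster.
table = "events"                          # hypothetical Delta table
zorder_cols = ["event_date", "device_id"]  # hypothetical filter columns
optimize_sql = f"OPTIMIZE {table} ZORDER BY ({', '.join(zorder_cols)})"
# spark.sql(optimize_sql)  # uncomment on Databricks / Delta Lake
```

ZORDER is most useful on high-cardinality columns that appear in query predicates; re-running OPTIMIZE periodically keeps file sizes healthy as the stream appends data.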
Hi @ChristianRRL Thank you for reaching out.
Setting the table property "quality": "<specific_medallion>" (e.g. bronze, silver, gold) on a DLT table serves a practical purpose in the data architecture pattern known as the Medallion Architecture.
This ar...
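As a small sketch, the medallion labels are just key/value table properties; in a pipeline each dict below would be passed to `@dlt.table(table_properties=...)` (`import dlt` only works inside a Databricks DLT pipeline, so it is omitted here):

```python
# Medallion "quality" labels expressed as DLT table properties.
# In a DLT pipeline: @dlt.table(table_properties=bronze_props) etc.
bronze_props = {"quality": "bronze"}  # raw, as-ingested data
silver_props = {"quality": "silver"}  # cleaned and conformed data
gold_props = {"quality": "gold"}      # business-level aggregates
```

The property carries no built-in behavior; it is a convention that makes the layer of each table discoverable in the catalog and filterable in queries over table metadata.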
Hi @rt-slowth I would like to share the Databricks documentation, which covers stream-static table joins:
https://docs.databricks.com/en/delta-live-tables/transform.html#stream-static-joins
Stream-static joins are a good choic...
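To make the semantics concrete, here is a pure-Python sketch (no Spark needed) of how a stream-static join behaves: each micro-batch of the stream is joined against the latest snapshot of the static table at the time the batch is processed. All names and data are hypothetical:

```python
# Stream-static join semantics, simulated without Spark.
static_dim = {1: "EU", 2: "US"}  # customer_id -> region (the "static" Delta table)

# Two micro-batches of a stream of (customer_id, order_id) events.
micro_batches = [[(1, "o1"), (2, "o2")], [(3, "o3")]]

results = []
for batch in micro_batches:
    # The static side is re-read (latest version) for every micro-batch,
    # so updates to the dimension table are picked up between batches.
    results.extend((oid, static_dim.get(cid)) for cid, oid in batch)
# Unmatched keys (customer_id 3) yield None, like a left join.
```

In real PySpark this would be `spark.readStream.table(...)` joined with `spark.read.table(...)`; the simulation above only illustrates the per-batch snapshot behavior.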
Hi @israelst
When working with Kinesis in Databricks, you can handle various data formats, including JSON, Avro, or raw bytes. The key is to decode the data appropriately in your Spark application.
Before reading your stream, define your data...
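Since Kinesis delivers each record payload as raw bytes, the decode step matters. In Spark this is typically `col("data").cast("string")` followed by `from_json(...)` with your schema; a plain-Python sketch of the same decode step (the payload below is hypothetical):

```python
import json

# A Kinesis record payload arrives as bytes; for JSON data, decode then parse.
raw_payload = b'{"sensor": "t1", "value": 21.5}'  # hypothetical payload
record = json.loads(raw_payload.decode("utf-8"))
# record is now a dict you can validate against your expected schema.
```

For Avro payloads the same pattern applies, but with `from_avro` and the writer schema instead of `from_json`.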