Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

pskchai
by New Contributor
  • 1792 Views
  • 2 replies
  • 0 kudos

Resolved! Using DLT with a non-streaming large table

We have a source table that receives daily append operations, but the rows created within the last 30 days in this table can be updated or deleted. Thus, the source table is not exactly a streaming source. Our processing workflow involves performing "...
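Since the thread does not show the final approach, here is a minimal DLT (Python) sketch of one way to stream from a Delta source whose recent rows can still change, assuming a newer runtime that supports the skipChangeCommits reader option; the dataset and table names (events_processed, bronze.source_events) are placeholders, and skipping change commits is only acceptable if reprocessing updated or deleted rows is not required downstream:

import dlt

@dlt.table(
    name="events_processed",  # hypothetical target dataset name
    comment="Streams from a Delta source whose recent rows may still be updated or deleted"
)
def events_processed():
    return (
        spark.readStream
             .option("skipChangeCommits", "true")   # ignore commits that only update/delete existing rows
             .table("bronze.source_events")         # hypothetical source table name
    )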

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @Pongsakorn Chairatanakul, hope everything is going great. Just wanted to check in if you were able to resolve your issue. If yes, would you be happy to mark an answer as best so that other members can find the solution more quickly? If not, please...

1 More Replies
gilo12
by New Contributor III
  • 9249 Views
  • 3 replies
  • 2 kudos

MERGE INTO deletes from SOURCE

I am using the following query to make an upsert: MERGE INTO my_target_table AS target USING (SELECT MAX(__my_timestamp) AS checkpoint FROM my_source_table) AS source ON target.name = 'some_name' AND target.address = 'some_address' WHEN MATCHED AN...

Latest Reply
gilo12
New Contributor III
  • 2 kudos

I was using a view for my_source_table; once I changed that to be a table, the issue stopped. That unblocked me, but I think Databricks has a bug with using MERGE INTO from a VIEW.
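For anyone who hits the same behaviour, a minimal PySpark sketch of the workaround described above: materialize the view into a real Delta table and run the MERGE against that table. The view name (my_source_view), the column names, and the WHEN clauses are illustrative only, since the original statement in the thread is truncated:

# Snapshot the view into a Delta table so the MERGE source is not a view
spark.table("my_source_view") \
     .write.format("delta").mode("overwrite") \
     .saveAsTable("my_source_table")

spark.sql("""
    MERGE INTO my_target_table AS target
    USING (SELECT MAX(__my_timestamp) AS checkpoint FROM my_source_table) AS source
    ON target.name = 'some_name' AND target.address = 'some_address'
    WHEN MATCHED THEN UPDATE SET target.checkpoint = source.checkpoint   -- illustrative clause
    WHEN NOT MATCHED THEN INSERT (name, address, checkpoint)
        VALUES ('some_name', 'some_address', source.checkpoint)          -- illustrative clause
""")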

2 More Replies
gg_047320_gg_94
by New Contributor II
  • 8080 Views
  • 1 replies
  • 1 kudos

DLT Spark readStream fails on the source table which is overwritten

I am reading the source table which gets updated every day. It is usually append/merge with updates and is occasionally overwritten for other reasons. df = spark.readStream.schema(schema).format("delta").option("ignoreChanges", True).option('starting...
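For comparison, a minimal sketch of a Delta streaming read (the path and option values are placeholders, since the snippet in the post is truncated). Delta streams take their schema from the table itself, so the explicit .schema(schema) call is not needed, and ignoreChanges makes the reader re-emit rewritten rows rather than fail when the source table is updated or merged into:

df = (
    spark.readStream
         .format("delta")
         .option("ignoreChanges", "true")      # re-emit rewritten rows instead of failing on updates/merges
         .option("startingVersion", "latest")  # placeholder for the truncated 'starting...' option
         .load("/mnt/bronze/source_table")     # hypothetical source path
)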

Latest Reply
Debayan
Databricks Employee
  • 1 kudos

Hi, could you please confirm the DLT and DBR versions? Also, please tag @Debayan in your next response, which will notify me. Thank you!

weldermartins
by Honored Contributor
  • 7853 Views
  • 9 replies
  • 13 kudos

Resolved! Delta table upsert - Databricks community

Hello guys, I'm trying to use upsert via Delta Lake following the documentation, but the command doesn't update or insert new rows. Scenario: my source table is in the bronze layer and the updates or inserts are in the silver layer. from delta.tables impo...

Latest Reply
weldermartins
Honored Contributor
  • 13 kudos

I managed to find the solution. In the insert and update I was setting the target. Thanks @Werner Stinckens! delta_df = DeltaTable.forPath(spark, 'dbfs:/mnt/silver/vendas/') delta_df.alias('target').m...
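A minimal sketch of what the completed upsert might look like after that fix (bronze_df and the column names are hypothetical); the key point is that the update and insert values reference source, not target:

from delta.tables import DeltaTable

delta_df = DeltaTable.forPath(spark, 'dbfs:/mnt/silver/vendas/')

(delta_df.alias('target')
    .merge(bronze_df.alias('source'), 'target.id = source.id')   # bronze_df: hypothetical incoming dataframe
    .whenMatchedUpdate(set={'valor': 'source.valor'})             # values come from source, not target
    .whenNotMatchedInsert(values={'id': 'source.id', 'valor': 'source.valor'})
    .execute())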

8 More Replies
577391
by New Contributor II
  • 2300 Views
  • 2 replies
  • 0 kudos

Resolved! How do I merge two tables and track changes to missing rows as well as new rows

In my scenario, the new data coming in are the current, valid records. Any records that are not in the new data should be labeled as "Gone", any matching records should be labeled as "Updated", and finally, any new records should be added. So in sum...

Latest Reply
-werners-
Esteemed Contributor III
  • 0 kudos

Detecting deletions does not work out of the box. The merge statement will evaluate the incoming data against the existing data; it will not check the existing data against the incoming data. To mark deletions, you will have to specifically update tho...
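On newer runtimes (roughly DBR 12.1+ / Delta Lake 2.3+) the WHEN NOT MATCHED BY SOURCE clause covers exactly this "Gone" case in a single merge; a minimal sketch with hypothetical table and column names (my_target_table, incoming_data, id, value, status):

from delta.tables import DeltaTable

target = DeltaTable.forName(spark, "my_target_table")   # hypothetical target table
incoming = spark.table("incoming_data")                  # hypothetical new data

(target.alias("t")
    .merge(incoming.alias("s"), "t.id = s.id")
    .whenMatchedUpdate(set={"value": "s.value", "status": "'Updated'"})
    .whenNotMatchedInsert(values={"id": "s.id", "value": "s.value", "status": "'New'"})
    .whenNotMatchedBySourceUpdate(set={"status": "'Gone'"})   # target rows absent from the new data
    .execute())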

1 More Replies
_Orc
by New Contributor
  • 2876 Views
  • 2 replies
  • 1 kudos

Resolved! Checkpoint is getting created even though the microbatch append has failed

Use case: read data from a source table using Spark Structured Streaming (round the clock), apply transformation logic, and finally merge the dataframe into the target table. If there is any failure during the transformation or merge, the Databricks job should...
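A minimal foreachBatch sketch for this pattern (table names and the checkpoint path are placeholders): as long as any exception from the transformation or the merge is allowed to propagate out of the batch function, the stream fails and the offset for that micro-batch is not committed to the checkpoint, so it is reprocessed on restart; catching and swallowing the exception inside the function is what lets the checkpoint advance past a failed batch.

from delta.tables import DeltaTable

def upsert_batch(batch_df, batch_id):
    transformed = batch_df                                       # placeholder for the transformation logic
    target = DeltaTable.forName(spark, "my_target_table")        # hypothetical target table
    (target.alias("t")
        .merge(transformed.alias("s"), "t.id = s.id")
        .whenMatchedUpdateAll()
        .whenNotMatchedInsertAll()
        .execute())
    # Do not wrap this in a try/except that continues: a raised exception here
    # stops the query before the micro-batch offset is written.

(spark.readStream.table("my_source_table")                       # hypothetical source table
    .writeStream
    .foreachBatch(upsert_batch)
    .option("checkpointLocation", "/mnt/checkpoints/my_target")  # hypothetical checkpoint path
    .start())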

Latest Reply
Anonymous
Not applicable
  • 1 kudos

Hi @Om Singh, hope you are doing well. Just wanted to check in and see if you were able to find a solution to your question? Cheers

1 More Replies