Hi,
I wanted to check whether my approach to working with Delta Lake is correct.
1. The first time, I create the Delta table with the following command.
-> df_json.write.mode('overwrite').format('delta').save(delta_silver + json_file_path)
2. I then create a DeltaTable reference for the above location with the following command.
-> deltaTableStore = DeltaTable.forPath(spark, delta_silver + json_file_path)
3. I perform some operations on this deltaTableStore, such as merging new data into it.
4. My understanding is that the deltaTableStore.merge() operation also updates the data in the persisted table, so I do not need a separate write-back operation (a sketch of what I mean is below).
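
To make the question concrete, here is a minimal sketch of the whole flow. df_updates and the id join key are made-up names just for illustration; the paths are the same placeholders as above.

    from delta.tables import DeltaTable

    # 1. The first write creates the Delta table on storage.
    df_json.write.mode('overwrite').format('delta').save(delta_silver + json_file_path)

    # 2. Get a DeltaTable handle pointing at that same location.
    deltaTableStore = DeltaTable.forPath(spark, delta_silver + json_file_path)

    # 3. Merge: execute() commits the result straight to the table's storage.
    (deltaTableStore.alias('target')
        .merge(df_updates.alias('source'), 'target.id = source.id')
        .whenMatchedUpdateAll()
        .whenNotMatchedInsertAll()
        .execute())

    # 4. Re-reading the path should show the merged data without any extra save().
    spark.read.format('delta').load(delta_silver + json_file_path).show()
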
Please let me know if my understanding is correct.