04-13-2022 11:57 AM
Can't I do something like this in PySpark?
deltaTable.as("orginal_table")
.merge(df.as("update_table"), "orginal_table.state_code = update_table.state_code and orginal_table.attom_id = update_table.attom_id")
.whenMatched("orginal_table.sell_date < update_table.sell_date")
.updateAll()
.whenNotMatched()
.insertAll()
.execute()
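
For comparison, the Delta Lake Python API spells this a little differently: aliases are set with alias() (as is a reserved word in Python), and the matched/not-matched clauses use the combined builder methods whenMatchedUpdateAll / whenNotMatchedInsertAll. A minimal sketch, assuming deltaTable is already a DeltaTable handle and df is the updates DataFrame; the alias and column names are taken from the snippet above:

from delta.tables import DeltaTable

# Assumption: deltaTable was obtained earlier, e.g. via DeltaTable.forName / forPath,
# and df holds the incoming updates.
(
    deltaTable.alias("original_table")
    .merge(
        df.alias("update_table"),
        "original_table.state_code = update_table.state_code "
        "AND original_table.attom_id = update_table.attom_id",
    )
    # Overwrite a matched row only when the incoming record is newer.
    .whenMatchedUpdateAll("original_table.sell_date < update_table.sell_date")
    .whenNotMatchedInsertAll()
    .execute()
)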