Efficiently Delete/Update/Insert Large Datasets of Records in PostgreSQL from Spark DataFrame
05-21-2025 07:42 AM
I am migrating my ETL process from Pentaho to Databricks, using PySpark.
I have posted all the details here: Staging Ground: How to Insert, Update, Delete data using databricks for large records in PostgreSQL fr...
Can anyone please suggest an efficient way to perform these operations?
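For reference, one pattern I have seen suggested for this is to bulk-load the DataFrame into a PostgreSQL staging table over JDBC and then apply the insert/update/delete set-based on the server side. A rough sketch of that idea follows; all table and column names (target_table, staging_table, id, col1, col2) and connection details are placeholders, not my actual schema:

```python
# Sketch: stage the batch in PostgreSQL, then merge server-side in one pass.
import psycopg2

jdbc_url = "jdbc:postgresql://host:5432/mydb"  # placeholder connection info
props = {"user": "user", "password": "password",
         "driver": "org.postgresql.Driver"}

# 1) Bulk-write the incoming batch into a staging table.
(df.write
   .mode("overwrite")
   .option("truncate", "true")   # keep the table definition, just truncate it
   .option("batchsize", 10000)   # larger batches reduce round trips
   .jdbc(jdbc_url, "staging_table", properties=props))

# 2) Apply the changes on the PostgreSQL side in a single transaction.
conn = psycopg2.connect(host="host", dbname="mydb",
                        user="user", password="password")
with conn, conn.cursor() as cur:
    # Upsert: insert new rows, update existing rows on key conflict.
    cur.execute("""
        INSERT INTO target_table (id, col1, col2)
        SELECT id, col1, col2 FROM staging_table
        ON CONFLICT (id) DO UPDATE
        SET col1 = EXCLUDED.col1,
            col2 = EXCLUDED.col2;
    """)
    # Delete rows missing from the batch (only valid if the batch
    # represents the full desired state of the target table).
    cur.execute("""
        DELETE FROM target_table t
        WHERE NOT EXISTS (SELECT 1 FROM staging_table s WHERE s.id = t.id);
    """)
conn.close()
```

The appeal of this approach is that row-by-row updates from Spark are avoided entirely: the JDBC writer does one fast bulk insert, and PostgreSQL handles the merge with set-based SQL. Is this a reasonable direction, or is there a better-optimized pattern for large record volumes?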