06-08-2023 12:45 PM
I want to overwrite a PostgreSQL table, transactionStats, which is used by the customer-facing dashboards.
This table needs to be refreshed every 30 minutes. I am writing an AWS Glue Spark job that performs this operation over a JDBC connection.
Spark DataFrame write snippet -
df.write.format("jdbc").option("url", "connection_params").option("dbtable", "public.transactionStats").mode("overwrite").save()
I am seeing downtime on the dashboard while the table is being overwritten.
How can I avoid the downtime? Any solutions? A staging table?
Accepted Solutions
06-09-2023 03:53 AM
Can you use a table rename? Load into a temp table, rename the target table to something else, then rename the temp table to the target table's name.
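The rename-swap above could be sketched roughly as follows. This is only an illustration, not a tested Glue job: the staging table name, the `_old` suffix, the `build_swap_sql` helper, and the connection parameters are all placeholders I've assumed, and it presumes psycopg2 (or any plain PostgreSQL connection) is available in the job to run the renames in one transaction.

```python
# Sketch of the staging-table rename swap (assumed names throughout).
# Because the renames run inside a single transaction, dashboard readers
# see either the old table or the new one -- never an empty table.

def build_swap_sql(target: str, staging: str, old_suffix: str = "_old") -> list:
    """Return the statements that swap the staging table into place."""
    old = target + old_suffix
    return [
        f'DROP TABLE IF EXISTS "{old}";',
        f'ALTER TABLE "{target}" RENAME TO "{old}";',
        f'ALTER TABLE "{staging}" RENAME TO "{target}";',
    ]

# Step 1: Spark writes the fresh data to the staging table. Overwrite is
# safe here because no dashboard reads from the staging table:
#
#   df.write.format("jdbc") \
#       .option("url", jdbc_url) \
#       .option("dbtable", "public.transactionStats_staging") \
#       .mode("overwrite") \
#       .save()
#
# Step 2: a plain psycopg2 connection then runs the swap in one transaction:
#
#   with psycopg2.connect(**conn_params) as conn, conn.cursor() as cur:
#       for stmt in build_swap_sql("transactionStats", "transactionStats_staging"):
#           cur.execute(stmt)
#   # commit happens on context-manager exit; ALTER TABLE ... RENAME is
#   # transactional in PostgreSQL, so the swap is effectively atomic
```

Keeping the old table around under the `_old` name also gives you a quick rollback path if the new load turns out to be bad.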
06-15-2023 12:09 AM
Hi @Siddharth Kanojiya,
We haven't heard from you since the last response from @werners (Customer). Kindly share the requested information with us so that we can help you reach a solution.
Thanks and Regards