I want to overwrite a PostgreSQL table, public.transactionStats, that backs our customer-facing dashboards. The table needs to be refreshed every 30 minutes, and I am writing an AWS Glue Spark job that loads it over a JDBC connection.
Spark DataFrame write snippet:

df.write.format("jdbc") \
    .option("url", "connection_params") \
    .option("dbtable", "public.transactionStats") \
    .mode("overwrite") \
    .save()
I am seeing downtime on the dashboards while the table is being overwritten. As far as I can tell, mode("overwrite") drops and recreates the table (or truncates it, if the truncate option is set) before reloading it, so for the duration of the write the dashboards hit a missing or empty table.

How can I avoid this downtime? Is writing to a staging table and then swapping it into place the right approach?
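Here is a rough sketch of the staging-table idea, to frame the question. The staging/old table names, the connection placeholders, and the use of psycopg2 for the swap are all my own assumptions (psycopg2 would have to be made available to the Glue job, e.g. via --additional-python-modules), not something I have running yet:

import psycopg2

# Step 1: load the fresh data into a staging table. Only the staging
# table is dropped and recreated; the live table is untouched.
(df.write.format("jdbc")
    .option("url", "connection_params")
    .option("dbtable", "public.transactionStats_staging")
    .mode("overwrite")
    .save())

# Step 2: swap the staging table into place. ALTER TABLE ... RENAME is
# transactional in PostgreSQL, so dashboard queries see either the old
# data or the new data, never an empty table. The renames take a brief
# ACCESS EXCLUSIVE lock, so concurrent queries may block for a moment,
# but they will not fail with "relation does not exist".
conn = psycopg2.connect(host="<host>", dbname="<db>",
                        user="<user>", password="<password>")
try:
    with conn, conn.cursor() as cur:  # commits on success, rolls back on error
        cur.execute("DROP TABLE IF EXISTS public.transactionStats_old")
        cur.execute("ALTER TABLE public.transactionStats RENAME TO transactionStats_old")
        cur.execute("ALTER TABLE public.transactionStats_staging RENAME TO transactionStats")
finally:
    conn.close()

Two things I am unsure about: Spark's JDBC writer does have a truncate option (.option("truncate", "true") together with mode("overwrite")), but as I understand it the table is still empty while the load runs, so that alone would not remove the downtime. Also, since PostgreSQL renames follow the underlying relation, any views defined on the live table would end up pointing at the _old table after the swap. Is the rename swap above the standard pattern here, or would something like INSERT ... SELECT from the staging table inside a single transaction be a better fit?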