Hello Community,
I have a question about migrating data from PostgreSQL to Databricks. My PostgreSQL database receives new data every hour, and I want to synchronize these hourly inserts with the bronze layer in my Databricks catalog.
Currently, I run a scheduled Databricks workflow that pulls the data from PostgreSQL over JDBC. However, each hourly batch contains around 10 million records, which makes this process slow and hard to manage. Is there a simpler or more efficient solution to achieve this synchronization?
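For context, here is a minimal sketch of how I imagine a parallel, incremental JDBC pull could look. The table name `events`, the columns `id` and `updated_at`, the bronze table `bronze.events`, and the watermark bookkeeping are all illustrative assumptions, not my actual schema:

```python
from datetime import datetime

def watermark_predicate(last_synced, column="updated_at"):
    """Build a pushdown filter selecting only rows newer than the last sync.

    `last_synced` is the high-water mark saved after the previous run
    (hypothetical bookkeeping -- adapt to however you track it).
    """
    return f"{column} > timestamp '{last_synced.isoformat(sep=' ')}'"

def sync_hourly_batch(spark, jdbc_url, last_synced, num_partitions=32):
    """Read one hour of new rows from PostgreSQL in parallel, append to bronze.

    Assumes a source table `events` with a numeric `id` and an `updated_at`
    timestamp -- both names are placeholders.
    """
    # Push the incremental filter down to PostgreSQL so only new rows move.
    pushdown = f"(SELECT * FROM events WHERE {watermark_predicate(last_synced)}) AS src"
    df = (spark.read.format("jdbc")
          .option("url", jdbc_url)
          .option("dbtable", pushdown)
          .option("partitionColumn", "id")      # numeric column Spark splits on
          .option("lowerBound", "0")            # bounds of the split range;
          .option("upperBound", "100000000")    #   query min/max first in practice
          .option("numPartitions", str(num_partitions))  # parallel JDBC connections
          .option("fetchsize", "10000")         # rows per network round trip
          .load())
    # Append the batch into the bronze Delta table.
    df.write.format("delta").mode("append").saveAsTable("bronze.events")

# Example predicate for a run covering rows since noon on Jan 1:
# watermark_predicate(datetime(2024, 1, 1, 12, 0))
# → "updated_at > timestamp '2024-01-01 12:00:00'"
```

The `numPartitions`/`partitionColumn` options split the read across multiple JDBC connections, which is usually what makes a 10M-row batch tolerable; without them, Spark fetches everything over a single connection.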