Resolved! Optimizing Writes from Databricks to Snowflake
My job, after doing all the processing in the Databricks layer, writes the final output to Snowflake tables using the df.write API and the Spark Snowflake connector. I often see that even a small dataset (16 partitions with 20k rows in each partition) takes a...
- 7954 Views
- 6 replies
- 2 kudos
Latest Reply
There are a few options I tried out that gave me better performance. Caching the intermediate or final results, so that the DataFrame computation does not repeat while writing. Coalescing the results into partitions 1x or 0.5x your number...
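A minimal sketch of the pattern the reply describes, assuming PySpark and the standard Spark-Snowflake connector format `"snowflake"`; the helper function, option names shown in comments, and the table name `FINAL_OUTPUT` are illustrative, not from the thread:

```python
def target_partitions(num_cores: int, factor: float = 1.0) -> int:
    """Coalesce target per the reply's heuristic: 1x (factor=1.0) or
    0.5x (factor=0.5) the number of cores, never below 1."""
    return max(1, int(num_cores * factor))

# In PySpark the write itself would then look roughly like this
# (shown as comments, since it needs a live Spark session and
# Snowflake credentials):
#
#   df = df.cache()          # keep results so the write doesn't recompute lineage
#   df.count()               # action to materialize the cache
#   (df.coalesce(target_partitions(num_cores=32, factor=0.5))
#      .write
#      .format("snowflake")
#      .options(**sf_options)            # sfUrl, sfUser, sfDatabase, ...
#      .option("dbtable", "FINAL_OUTPUT")
#      .mode("overwrite")
#      .save())
```

Fewer, larger partitions reduce the number of staged files the connector pushes to Snowflake, which is usually where the small-dataset write time goes.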