Databricks Write Performance
07-29-2021 07:56 AM
I have a requirement to replay ingestion from landing data and build a silver table. I am trying to write Delta files from raw Avro files in the landing zone; the raw files are organized into folders by date. I am currently using streaming to read the files and write them into a Delta table with Z-ordering, roughly as sketched below.
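For context, here is a minimal sketch of what I am doing today. The paths, the schema handling, and the Z-order column (event_id) are placeholders, not my real values:

```python
# Minimal sketch of the current streaming setup; paths, schema handling, and
# the Z-order column are placeholders. `spark` is the notebook SparkSession.
landing_path = "/mnt/landing/2021-07-29/"      # date-based raw Avro folder (assumed layout)
silver_path = "/mnt/silver/events"             # target Delta table path (assumed)
checkpoint_path = "/mnt/checkpoints/events"    # streaming checkpoint location (assumed)

# Streaming file sources need an explicit schema, so infer it once from a batch read
raw_schema = spark.read.format("avro").load(landing_path).schema

stream_df = (
    spark.readStream
         .format("avro")
         .schema(raw_schema)
         .load(landing_path)
)

query = (
    stream_df.writeStream
             .format("delta")
             .option("checkpointLocation", checkpoint_path)
             .trigger(once=True)                # drain the backlog once, batch-style
             .start(silver_path)
)
query.awaitTermination()

# Z-ordering runs as a separate OPTIMIZE pass on the Delta table (Databricks SQL)
spark.sql(f"OPTIMIZE delta.`{silver_path}` ZORDER BY (event_id)")
```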
The issue is that it is taking 8-10 hours for each day (around 20 million records). Any pointers on how I can improve performance? A few options I can think of:
- Use a batch read instead of a stream (see the sketch after this list)?
- Use Parquet instead of Delta?
- Anything else?
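For the batch idea, something like the sketch below is what I have in mind. Again, the paths and the Z-order column are placeholders:

```python
# Hedged sketch of the batch alternative: read one day's Avro folder directly,
# append it to the silver Delta table, then Z-order in a single OPTIMIZE pass.
day_path = "/mnt/landing/2021-07-29/"   # one day's raw Avro folder (assumed)
silver_path = "/mnt/silver/events"      # target Delta table path (assumed)

(spark.read.format("avro")
      .load(day_path)
      .write.format("delta")
      .mode("append")
      .save(silver_path))

# Running OPTIMIZE once after the bulk load avoids re-clustering on every micro-batch
spark.sql(f"OPTIMIZE delta.`{silver_path}` ZORDER BY (event_id)")
```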
Thanks in advance for your kind help.