04-05-2022 04:50 AM
Hi All, all of a sudden our Databricks dev environment is throwing memory-related exceptions such as "out of memory" and "result too large".
The error messages are not helping us identify the root cause.
Can someone please advise where to start investigating?
I am hitting this issue while reading a JSON file into a DataFrame.
04-05-2022 05:20 AM
OOM errors are quite common; they usually mean your partitions are too large to fit into memory. Analyze the Spark UI, look for data spills and skew, and try smaller partitions and/or more shuffle partitions.
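As a rough illustration of the "more shuffle partitions" advice above, here is a minimal sketch of the common rule of thumb that shuffle partitions should each be on the order of 100-200 MB, so their count should scale with the shuffle volume shown in the Spark UI stage metrics. The helper name and the 128 MB target are assumptions for illustration, not a Spark API:

```python
import math

TARGET_PARTITION_MB = 128  # assumed rule-of-thumb target partition size


def suggested_shuffle_partitions(shuffle_size_mb: float,
                                 target_mb: int = TARGET_PARTITION_MB) -> int:
    """Suggest a value for spark.sql.shuffle.partitions.

    `shuffle_size_mb` is the shuffle read/write volume observed in the
    Spark UI's stage metrics (a hypothetical input for this sketch).
    """
    return max(1, math.ceil(shuffle_size_mb / target_mb))


# e.g. a 50 GB shuffle at ~128 MB per partition suggests 400 partitions,
# well above Spark's default of 200, so the default would likely spill.
print(suggested_shuffle_partitions(50 * 1024))  # -> 400
```

The suggested value can then be applied with `spark.conf.set("spark.sql.shuffle.partitions", "400")` before the shuffle-heavy stage runs.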
04-06-2022 02:43 AM
Thanks for the response @Hubert Dudek.
If I run the same code in the test environment, it completes successfully, but in dev it throws the out-of-memory error. The configurations of the test and dev environments are exactly the same.
04-29-2022 03:17 PM
Hi @Pavan Bangad,
Just a friendly follow-up. Did @Hubert Dudek's response help you resolve your issue? If yes, please select it as the best answer. If not, please let us know so we can continue helping you find a solution.