04-05-2022 04:50 AM
Hi All, all of a sudden in our Databricks dev environment we are getting memory-related exceptions such as "out of memory" and "result too large".
The error messages are not helping us identify the root cause.
Can someone please suggest a starting point for investigating this?
I am hitting the issue while reading a JSON file and loading it into a DataFrame.
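For context, a minimal sketch of the kind of read that triggers the error; the actual file path was not shared, so `/mnt/data/input.json` below is hypothetical:

```python
# Minimal repro sketch (hypothetical path; `spark` is pre-created in Databricks notebooks).
df = spark.read.json("/mnt/data/input.json")
df.show(5)
```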
Labels:
- Error Message
- Memory
- Sqlserver
Accepted Solutions
04-05-2022 05:20 AM
OOM errors are quite common; they usually mean your partitions are too large to fit into memory. Analyze the Spark UI, look for data spills and skew, and try using smaller partitions and/or more shuffle partitions.
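A sketch of the kind of tuning this suggests, assuming a DataFrame `df` (hypothetical); the partition counts are illustrative, not prescriptive:

```python
# Increase the number of shuffle partitions so each task handles less data (default is 200).
spark.conf.set("spark.sql.shuffle.partitions", "400")

# Repartition the DataFrame into smaller slices before a wide operation.
df = df.repartition(400)

# On Spark 3.x, adaptive query execution can automatically split skewed partitions.
spark.conf.set("spark.sql.adaptive.enabled", "true")
spark.conf.set("spark.sql.adaptive.skewJoin.enabled", "true")
```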
04-06-2022 02:43 AM
Thanks for the response @Hubert Dudek .
If I run the same code in the test environment, it completes successfully, but in dev it gives the out-of-memory issue. Also, the configuration of the test and dev environments is exactly the same.
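One way to verify the two environments really do match is to dump the effective Spark configuration in each workspace and diff the output; a sketch using the standard SparkConf API:

```python
# Print the effective Spark configuration so dev and test can be compared side by side.
for key, value in sorted(spark.sparkContext.getConf().getAll()):
    print(key, "=", value)
```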
04-29-2022 03:17 PM
Hi @Pavan Bangad ,
Just a friendly follow-up. Did @Hubert Dudek 's response help you resolve your issue? If yes, please select it as the best answer. If not, please let us know so we can continue helping you find a solution.

