Hello @Parth2692!
It's possible that your dev and prod environments have different serverless configurations, which could explain the difference in behavior.
You can try increasing the notebook memory by switching from Standard to High in the Environment side panel. However, note that this doesn't affect the Spark executor memory, which can't be manually configured when using serverless compute.
If the issue persists, optimize your Spark job to reduce memory usage: split large jobs into smaller tasks, avoid unnecessary caching, and adjust how the data is partitioned. A sketch of these techniques follows below.
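Here is a minimal PySpark sketch of those three ideas, assuming a notebook with a live `spark` session. The table and column names (`sales_raw`, `region`, `event_date`, `sales_daily_counts`) are hypothetical placeholders for your own data:

```python
from pyspark.sql import functions as F

df = spark.table("sales_raw")  # hypothetical source table

# 1. Split one large job into smaller tasks, e.g. one pass
#    per date slice (the date list here is just illustrative).
for day in ["2024-01-01", "2024-01-02"]:
    slice_df = df.filter(F.col("event_date") == day)
    (slice_df.groupBy("region").count()
        .write.mode("append")
        .saveAsTable("sales_daily_counts"))

# 2. Cache only DataFrames that are actually reused, and
#    release the memory as soon as you are done with them.
reused = df.filter(F.col("region") == "EMEA").cache()
total_rows = reused.count()                              # first action populates the cache
daily = reused.groupBy("event_date").count().collect()   # second action reuses it
reused.unpersist()                                       # frees executor memory

# 3. Repartition so each task processes a smaller chunk of data;
#    200 is an example value to tune for your dataset size.
df = df.repartition(200, "region")
```

The right partition count depends on your data volume, so treat the value above as a starting point rather than a recommendation.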