02-28-2023 12:56 PM
I am using a ThreadPoolExecutor to run notebooks in parallel. However, these parallel notebooks are not using the executors at all; all of the load goes to the driver node, which runs out of memory and eventually crashes.
The parallel notebooks are all the same and involve creating huge pandas DataFrames, converting them to Spark DataFrames, and appending them to Delta tables. What am I missing? How do I redirect the load to the executor nodes?
- Labels:
  - Nodes
  - Parallel notebooks
  - Worker Nodes
Accepted Solutions
03-07-2023 12:00 AM
@uzair mustafa : Using a ThreadPoolExecutor to parallelize the execution of notebooks may not be enough to distribute the load across your cluster. With a ThreadPoolExecutor, all threads run on the same node (the driver), so the driver running out of memory is the expected result.
To tackle the problem, try running each notebook as a separate process and creating a Spark context within that process. You can use Python's "subprocess" module to spawn a new process for each notebook.
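For reference, here is a minimal sketch of that approach, assuming the logic from each notebook has been exported to standalone Python scripts that each create their own SparkSession (for example via `SparkSession.builder.getOrCreate()`) before doing their work. The script names below are hypothetical placeholders.

```python
# Minimal sketch: run each (hypothetical) job script in its own OS process
# instead of a driver-side thread, so each gets its own Python interpreter
# and its own Spark context.
import subprocess
import sys

scripts = ["ingest_job_a.py", "ingest_job_b.py", "ingest_job_c.py"]  # hypothetical names

# Launch all scripts concurrently as separate processes.
procs = [subprocess.Popen([sys.executable, s]) for s in scripts]

# Wait for every process to finish and report any non-zero exit codes.
for proc in procs:
    proc.wait()
    if proc.returncode != 0:
        print(f"{proc.args} exited with code {proc.returncode}")
```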
03-12-2023 09:47 PM
Hi @uzair mustafa
Thank you for your question! To assist you better, please take a moment to review the answer above and let us know whether it addresses your needs.
If it does, please help us select the best solution by clicking "Select As Best".
Your feedback helps us ensure that we are providing the best possible service to you.
Thank you!

