Increase stack size Databricks
06-03-2024 08:56 AM - edited 06-03-2024 10:34 AM
Hi everyone,
I'm currently running a shell script in a notebook and hitting a segmentation fault caused by the stack size limit. I'd like to raise the limit with ulimit -s unlimited, but I'm having trouble setting it in the notebook environment.
I am using:
Could anyone provide guidance on how to properly increase the stack size for my shell script using Notebooks in Databricks? Any tips or alternative solutions to avoid the segmentation fault would also be greatly appreciated.
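One detail that often matters here: ulimit only affects the shell process it runs in and that shell's children, and each %sh cell (or web-terminal session) starts a fresh shell. So the ulimit call and the script have to run in the same cell. A minimal sketch, where the script name is a placeholder:

```shell
# ulimit applies only to the current shell and its children; each %sh cell
# starts a fresh shell, so the limit and the script must run in the same cell.
ulimit -s unlimited 2>/dev/null || ulimit -s 65536  # fall back to a large finite soft limit
ulimit -s                                           # print the effective limit to verify
if [ -f ./my_script.sh ]; then                      # placeholder script name
    bash ./my_script.sh
fi
```

Raising the soft limit to "unlimited" can fail without privileges when the hard limit is finite, which is why the fallback to a large finite value is included.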
06-06-2024 08:45 AM
Hi @Retired_mod ,
Thanks for your response. I tried this and unfortunately I could not get it to work.
When I set spark.databricks.driver.maxReplOutputLength to unlimited in the cluster configuration, the notebook failed with: "Failure starting repl. Try detaching and re-attaching the notebook." Detaching and re-attaching the cluster didn't help. Looking into it more, it appears the setting only accepts an integer value. I also tried this in the web terminal and still got the segmentation fault.
Next, I set spark.databricks.driver.maxReplOutputLength to a very high number (e.g. 500000000) and received the same segmentation fault in both the notebook and the web terminal.
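One thing worth checking before tuning Spark configs (spark.databricks.driver.maxReplOutputLength appears to govern REPL output length rather than process resource limits): whether the soft stack limit can be raised at all in this environment. The soft limit can always be raised up to the hard limit without extra privileges; only exceeding the hard limit fails with "Operation not permitted". A quick diagnostic sketch:

```shell
# Print the soft and hard stack limits, then raise the soft limit to the
# hard limit. If the hard limit itself is too small, that is the real
# blocker and would need a cluster-level change.
echo "soft: $(ulimit -Ss)"
echo "hard: $(ulimit -Hs)"
hard=$(ulimit -Hs)
ulimit -Ss "$hard" 2>/dev/null && echo "raised soft limit to $hard"
```

If the hard limit already reports "unlimited", then ulimit -s unlimited in the same cell as the script should succeed, and the segfault may have a different cause (e.g. deep recursion in a thread, whose stack size is fixed at thread creation).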
Do you have any other ideas of things I could try?

