06-10-2025 10:53 PM
Hi everyone,
I've tried setting the Spark configuration spark.databricks.repl.timeout to 300, but I'm still getting a REPL timeout error saying it took longer than 60 seconds. It seems the configuration might be incorrect. Can someone guide me to the correct Spark config to change in order to increase the REPL timeout?
Ref to last conversation: https://community.databricks.com/t5/data-engineering/intermittent-timeout-error-while-waiting-for-py...
06-11-2025 09:23 AM
Hi @mkwparth
Good day!!
This error occurs when the IPython kernel doesn't start within 80 seconds; when that happens, the session can fail or time out with a launch error.
To overcome the issue, could you please increase the timeout by setting the config
spark.databricks.driver.ipykernel.launchTimeoutSeconds to 300 (5 minutes; the default is 1.2 minutes)? Let us know if you still see the error after applying the configuration.
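As a sketch of where this setting can go (assuming a cluster-level Spark config or a DLT pipeline settings JSON; the exact placement depends on your workspace setup), the key-value pair would look like:

```json
{
  "configuration": {
    "spark.databricks.driver.ipykernel.launchTimeoutSeconds": "300"
  }
}
```

For an interactive cluster, the equivalent line in the cluster's Spark config field would be `spark.databricks.driver.ipykernel.launchTimeoutSeconds 300`. Note that driver-level settings like this generally take effect only after the cluster or pipeline is restarted.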
06-11-2025 11:09 PM
Hi @Saritha_S ,
Yes, I've applied the Spark config you suggested. I'll observe for a few days and let you know.
Thanks for your help!
a month ago
Hi @Saritha_S
I've updated the Spark configuration as per your suggestion, but I'm still encountering issues.
Out of 6 retry attempts, 4 resulted in the following error:
com.databricks.pipelines.common.errors.deployment.DeploymentException: Communication lost with driver.
The remaining 2 attempts failed due to a REPL timeout error. See the attached screenshots.
Could you please advise on what steps I can take to prevent these errors? Let me know if you need any additional information from my side.
Thanks in advance for your help!