3 weeks ago
I am trying to run a piece of code, and every time I get "Notebook detached: Exception when creating execution context: java.net.SocketTimeoutException: Connect Timeout". Can anyone please tell me if there is an ongoing issue with the Databricks Community Edition clusters?
2 weeks ago
Update: The issue with Community Edition clusters has been mitigated. If you're still encountering the error, please try restarting your cluster. New cluster creation has been validated and is working now.
3 weeks ago
Could be an intermittent issue. Has this code run before on Community Edition? Remember, Community Edition has very limited functionality and very low limits on data and CPU usage. What command are you running, and what does the data volume look like? More information is needed. Thanks, Louis.
3 weeks ago
It's just a simple copy of some very small files. I'm not sure what's going on, but I created another account and the command runs fine in that workspace. It might be an intermittent issue.
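For reference, it is essentially a one-line dbutils copy like the sketch below; the paths are placeholders rather than my actual ones.

```python
# Sketch of the kind of small file copy that hits the timeout.
# Both paths are placeholders.
dbutils.fs.cp(
    "dbfs:/FileStore/tables/source_file.csv",       # placeholder source
    "dbfs:/FileStore/tables/source_file_copy.csv",  # placeholder destination
)
```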
3 weeks ago
I am running into the same issue. I am just trying to initialize a spark session. Please let me know if there is a fix.
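For what it's worth, it is essentially the standard builder call below (the app name is a placeholder), and even that hits the timeout:

```python
from pyspark.sql import SparkSession

# Minimal session init; in a Databricks notebook this should simply return
# the pre-created session instead of building a new one.
spark = SparkSession.builder.appName("timeout-test").getOrCreate()
print(spark.version)
```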
3 weeks ago
I just created another account and the command runs fine in that workspace. It might be an intermittent issue. Try the same and see if it works for you.
2 weeks ago
Hello,
I just started learning, and while reading a file I am getting the same issue: "Exception when creating execution context: java.net.SocketTimeoutException: Connect Timeout". I am using a basic file-read command along the lines of the snippet below.
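The path and options here are placeholders, not my exact ones:

```python
# Illustrative only: a basic CSV read with a placeholder DBFS path.
df = (
    spark.read
    .option("header", "true")
    .csv("dbfs:/FileStore/tables/sample.csv")
)
display(df)
```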
2 weeks ago
Have you tried dropping and recreating the cluster? CE is known to have cluster outage issues. Often, dropping and re-creating solves the problem.
2 weeks ago
I'm having the same issue. I tried all that but it's still not working.
2 weeks ago
I have not been able to create a cluster since last Thursday. Any solution?
2 weeks ago
I've been trying to use Community Edition and am running into issues today; I cannot create a cluster.
Any idea when this issue will be resolved?
2 weeks ago
The following worked for me:
In the cluster configuration, set the Spark config to "spark.databricks.session.noContextTimeout 60" and restart the cluster.
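To confirm the setting is applied after the restart, you can read it back from a notebook cell (this just echoes the key quoted above):

```python
# Read back the config key from the workaround above to confirm it is applied.
# "not set" is returned if the key is missing from the cluster's Spark config.
value = spark.conf.get("spark.databricks.session.noContextTimeout", "not set")
print(f"spark.databricks.session.noContextTimeout = {value}")
```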
2 weeks ago
I have faced this issue multiple times this past week. Whether during the day, at night, or early in the morning, it only works sometimes.
2 weeks ago
I've flagged this issue internally, and our technical team is actively working to resolve it as quickly as possible. Thank you all for your patience.