Azure Databricks container runtime broken in 9.1 LTS, how to fix?

HQJaTu
New Contributor III

For stability, I've stuck with the LTS runtime. Last Friday my containers stopped working with the error message:

Py4JException: An exception was raised by the Python Proxy. Return Message: Traceback (most recent call last):
  File "/databricks/spark/python/lib/py4j-0.10.9-src.zip/py4j/java_gateway.py", line 2442, in _call_proxy
    return_value = getattr(self.pool[obj_id], method)(*params)
  File "/databricks/python_shell/dbruntime/pythonPathHook.py", line 45, in initStartingDirectory
    os.chdir(directory)
FileNotFoundError: [Errno 2] No such file or directory: '/Workspace/Repos/Git-repos/'

After hours and hours of troubleshooting with nothing working, I switched to runtime 10.2 (the non-LTS version), which does not have this problem.

It looks like a regression to me. The JRE certainly got updated (the Log4j fixes), and this inability to run containers apparently wasn't covered by automated tests.

Any ideas how to get past the error? As this error happens before my Python code runs, there is very little I can do to fix it from inside my container.
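For reference, a minimal check one could run from a notebook attached to the affected cluster, using the path from the traceback above. The fallback to /databricks/driver is only a guess at a safe working directory, not an official fix:

import os

# Path the runtime tries to chdir into, taken from the traceback above.
repos_path = "/Workspace/Repos/Git-repos/"

# Does the Repos mount exist at all on this cluster?
print("repos path exists:", os.path.isdir(repos_path))
print("/Workspace exists:", os.path.isdir("/Workspace"))

# Work around the failed chdir for the current session by moving to a
# directory that should exist (assumption: the default driver directory).
if not os.path.isdir(repos_path):
    os.chdir("/databricks/driver")

print("current working directory:", os.getcwd())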


12 REPLIES

Hubert-Dudek
Esteemed Contributor III

This is the first time I've seen that error, but you could try disabling and re-enabling Repos in the admin settings of your Databricks workspace.

HQJaTu
New Contributor III

If I disable Repos, where do my notebooks come from?

Hubert-Dudek
Esteemed Contributor III

I meant it only as a test, to check whether it helps. Maybe re-enabling it will recover the directory structure.

Additionally, that error looks as if Git is disabled...

(Notebooks come from the workspace, or from a Git repo, which is part of the workspace.)

HQJaTu
New Contributor III

On second thought, your idea is something I might try. However, with a newer Databricks node version the problem goes away. Mind you, I had been using the LTS node since May 2021 without problems.

All this leaves me conflicted: should I reconnect the Git repo and see if that helps, or accept that the LTS version is unusable with Python containers and keep my current compute configuration?
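(For reference, a rough sketch of what bumping the runtime while keeping a custom container could look like via the Clusters API. The host, token, cluster id, node type and image URL below are placeholders, and the exact values are assumptions, not a verified configuration:)

import requests

# Sketch only: move an existing cluster from the 9.1 LTS runtime to 10.2
# while keeping the custom Docker image. All identifiers are placeholders.
host = "https://adb-1234567890123456.7.azuredatabricks.net"
token = "dapi-REDACTED"

payload = {
    "cluster_id": "0123-456789-abcdefgh",
    "cluster_name": "custom-container-cluster",
    "spark_version": "10.2.x-scala2.12",   # was "9.1.x-scala2.12" (LTS)
    "node_type_id": "Standard_DS3_v2",
    "num_workers": 2,
    "docker_image": {"url": "myregistry.azurecr.io/my-databricks-image:latest"},
}

resp = requests.post(
    f"{host}/api/2.0/clusters/edit",
    headers={"Authorization": f"Bearer {token}"},
    json=payload,
)
resp.raise_for_status()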

Anonymous
Not applicable

@Jari Turkia​ - Checking in to see how it went.

HQJaTu
New Contributor III

This is getting worse. Now a JDBC write to SQL is failing for the same reason. I haven't found a solution for this yet.

Am I not supposed to use containers? Or Python?

This is not cool. 😞
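(For context, the failing operation is an ordinary Spark JDBC write along these lines. The original post does not show the code, so this is only a guess at the shape of the call; the connection details are placeholders and `spark` is the ambient SparkSession in a Databricks notebook:)

# Minimal example of the kind of JDBC write that hits the same error on
# the 9.1 LTS runtime. Connection details are placeholders.
df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "value"])

jdbc_url = "jdbc:sqlserver://myserver.database.windows.net:1433;database=mydb"

(df.write
   .format("jdbc")
   .option("url", jdbc_url)
   .option("dbtable", "dbo.results")
   .option("user", "<username>")
   .option("password", "<password>")
   .mode("append")
   .save())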

Atanu
Esteemed Contributor

We need detailed logs of what is happening inside to troubleshoot this. Do you have access to our support?

HQJaTu
New Contributor III

No. All I have is access to Azure with Databricks in it.

Kaniz
Community Manager

Hi @Jari Turkia​ , Can you please share the log details?

HQJaTu
New Contributor III

A long time has passed, and I have a solution that gets things working. I do have the logs, but I haven't paid much attention to this lately.

To me, this is a regression issue in Azure Databricks. When a new node version is released, it should be tested with containers too. If old node versions are not, maybe I should be informed about that.

Kaniz
Community Manager

Hi @Jari Turkia​ , Thank you for your response. Would you like to share the solution here as well?

HQJaTu
New Contributor III
Accepted Solution

The latest node version seems to work for me. Choosing a different container wasn't a factor; the problem was with LXC. I haven't done any follow-up on this. Also, my containers are custom; see my pull request https://github.com/databricks/containers/pull/73
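(A quick way to confirm which runtime a custom container session is actually running on, assuming the standard DATABRICKS_RUNTIME_VERSION environment variable is set on the cluster and `spark` is the notebook's SparkSession:)

import os

# Report the Databricks runtime and Spark versions for the current session.
print("Databricks runtime:", os.environ.get("DATABRICKS_RUNTIME_VERSION", "unknown"))
print("Spark version:", spark.version)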
