Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Help Needed: Errors with df.display() and df.show() in Databricks

yvishal519
Contributor

Dear Databricks Community,

I am reaching out to you for assistance with some issues I'm encountering in my Databricks environment. I'm hoping the community can provide some guidance to help me resolve these problems.

1. Error with df.display(): When I try to use the df.display() function, I'm receiving the following error:
"'Failed to store the result. Try rerunning the command. Failed to upload command result to DBFS. Error message: PUT request to create file error HttpResponse Proxy(HTTP/1.1 404 The specified filesystem does not exist. [Content-Length: 175. Content-Type: application/json;charset=utf-8, Server: Windows-Azure-HDFS/1.0 Microsoft-HTTPAPI/2.0, x-ms-error-code: FilesystemNotFound, x-ms-request-id: 33854442-8011- 0028-3da6-bc0285000000, x-ms-version: 2021-04-10, Date: Wed, 12 Jun 2024 08:59:45 GMT] Response Entity Proxy([Content-Type: application/json;charset=utf-8 Content-Length: 175, Chunked: false])) "

2. Error with df.show():
I'm also facing a similar issue when using df.show() on DataFrames with more than 10-20 rows; the same error as in point 1 appears.

Debugging insights:
1. During my investigation, I found that the issue seems to be related to Databricks trying to access the root storage and failing to write the data.

Error in the cluster logs: ERROR PresignedUrlClientUtils$: FS_OP_CREATE FILE[https://<root_storage-account-name>.dfs.core.windows.net/jobs/4079550013704479/command-results/4056370604825597/b3041abc-2eeb-45da-90b7-23d98973d4d0] Presigned URL: Failed to upload stream using AzureAdl2SasUri

2. I tried to upload some files to DBFS (Databricks File System) from the UI, but it throws an error saying 'The Azure Container Does Not exist' along with a 500 status code.

3. The same error appears when I try to import three or four notebooks into a shared location.

4. I tried different cluster types and Databricks Runtime versions, but the same issues persist.

5. I have admin access on the Databricks workspace, and I am on the Premium tier (with role-based access controls).

6. Currently I am using another storage account for my data engineering work; with storage credentials, I can access that data from notebooks without any problems.

I'm not sure how to resolve this problem, as the root storage is inside the managed resource group of Databricks.
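
Since the failing uploads go to DBFS, a minimal way to reproduce this outside display()/show() (assuming a notebook where dbutils is available; the path is illustrative) is any small write to DBFS:

    # Tiny write to DBFS, backed by the root storage account; this fails the
    # same way if the root container is missing.
    dbutils.fs.put("dbfs:/tmp/root-storage-check.txt", "test", True)
    display(dbutils.fs.ls("dbfs:/tmp/"))  # should list the file back if the root container exists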

I would greatly appreciate it if the community could provide any insights, suggestions, or solutions to help me address these issues. Your assistance would be invaluable in getting my Databricks environment back on track.

Thank you in advance for your time and support.

Best regards,
Vishal Yadav


 


4 REPLIES

-werners-
Esteemed Contributor III

Hard to tell. Do you use Unity Catalog? Have you defined storage credentials/external storage etc.?
The reason show() and display() give the error is Spark's lazy evaluation, so the error could be somewhere earlier in the code, before those actions.
But it could also be a configuration issue.

If you use UC, you first have to make sure it is set up correctly.
If not: make sure you have mounted your storage, or at least test if you can list files (dbutils.fs.ls).
Make sure Databricks has permissions to read/write to the storage account.
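
A minimal sketch of that listing test (assuming a Databricks notebook where dbutils is available; the container and account names are placeholders):

    # A 403 here points at missing permissions on the storage account;
    # a 404/FilesystemNotFound points at a missing or misnamed container.
    display(dbutils.fs.ls("abfss://<container>@<storage-account>.dfs.core.windows.net/"))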

jacovangelder
Honored Contributor

Judging by the 404 error, it's one of the following:

  • Most likely: the storage hasn't been mounted correctly/successfully, i.e. the service principal used when creating the mount does not have the right RBAC role (Storage Blob Data Reader/Contributor) on the container in the storage account (see the mount sketch below).
  • The storage account is behind a firewall/private endpoint and Databricks is not able to access it.
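
A minimal sketch of mounting ADLS Gen2 with a service principal, per the first bullet (the names, secret scope, and tenant ID are all placeholders; the service principal needs the Storage Blob Data Contributor or Reader role on the container):

    # OAuth configs for a service-principal mount; every <...> value is a placeholder.
    configs = {
        "fs.azure.account.auth.type": "OAuth",
        "fs.azure.account.oauth.provider.type":
            "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        "fs.azure.account.oauth2.client.id": "<application-id>",
        "fs.azure.account.oauth2.client.secret":
            dbutils.secrets.get(scope="<scope>", key="<client-secret-key>"),
        "fs.azure.account.oauth2.client.endpoint":
            "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
    }

    dbutils.fs.mount(
        source="abfss://<container>@<storage-account>.dfs.core.windows.net/",
        mount_point="/mnt/<mount-name>",
        extra_configs=configs,
    )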

imsabarinath
New Contributor III

Looking at the error message, the issue might be related to the mounting itself or an incorrect reference to the mount points.

Please share the code snippet if you can.
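
In the meantime, a quick sanity check on the mount references (the mount name is a placeholder):

    # List every defined mount point and confirm the one the code references exists.
    mounts = {m.mountPoint: m.source for m in dbutils.fs.mounts()}
    print(mounts.get("/mnt/<mount-name>", "mount not found"))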

yvishal519
Contributor

Dear Databricks Community,

I wanted to share some updates regarding the issues I've been encountering in my Databricks environment.

After raising a ticket with Microsoft and collaborating with their team for approximately a week, we went through several troubleshooting steps. Ultimately, both the storage team and the Databricks team confirmed that, when the workspace resources were provisioned, the root storage account was created with an issue that left the root container uncreated, and this is what caused the problems we experienced. Since no one has control over the root storage (it is supposed to be created automatically), the fix was to re-provision the resources, after which the problems were resolved.
