Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Import failed with error: Could not deserialize (No such file or directory)

MJ_BE8
New Contributor III

I tried to import a notebook (.dbc) into my workspace today and this error popped up. I have never encountered it before. Is this a temporary bug, or has something changed for the Community Edition?

 

Error Message:

Import failed with error: Could not deserialize: /tmp/import-stage14449922761726312769/3267549799876087/7939851744215972/849c8978_9587_42cb_ab5d_8ddfb72d44aa-[my file name]__12_-92019.dbc (No such file or directory)


Walter_C
Databricks Employee

This error suggests a problem with the temporary storage used during the import process.

Here are a few steps you can take to troubleshoot and potentially resolve the issue:

  1. Retry the Import: Sometimes, these issues can be transient. Try importing the notebook again after some time to see if the problem persists.

  2. Export and Re-import: As a workaround, you can try exporting the notebook in a different format (e.g., as a Python file) and then re-importing it (see the API-based sketch after this list).

  3. Check File Size: Ensure that the file size of the .dbc notebook is within the import limits. If the file is too large, you might encounter import errors.
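
If the UI import keeps failing, one alternative worth trying is the Workspace import API. Below is a minimal sketch in Python, assuming you have a workspace URL and a personal access token; the placeholder values, target path, and file name are hypothetical, and Community Edition may not expose token-based API access at all.

import base64
import requests

WORKSPACE_URL = "https://<your-workspace>.cloud.databricks.com"  # hypothetical workspace URL
TOKEN = "<personal-access-token>"                                 # hypothetical access token

# Read the notebook exported as plain source (step 2 above) and base64-encode it,
# as required by the /api/2.0/workspace/import endpoint.
with open("my_notebook.py", "rb") as f:
    content = base64.b64encode(f.read()).decode("utf-8")

resp = requests.post(
    f"{WORKSPACE_URL}/api/2.0/workspace/import",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "path": "/Users/me@example.com/my_notebook",  # hypothetical target path
        "format": "SOURCE",                           # import plain source instead of a .dbc
        "language": "PYTHON",
        "content": content,
        "overwrite": True,
    },
)
resp.raise_for_status()
print(resp.json())

If the API call succeeds where the UI import fails, that points at the UI upload path rather than the notebook file itself.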

These didn't work for me; I tried them. My file size is 9 MB.

shetesaurabh03
New Contributor II

Hello,

I am also facing the same issue. I even created a new account, but the issue still persists.

I tried re-exporting and re-importing, but it did not work.

The file size is not very large; it's 1.5 MB.

RashidQuamarCog
New Contributor II

I am facing the same issue 

bogere
New Contributor II

Same issue here; I cannot import an .ipynb file that I worked on in JupyterLab.

 

massaro__
New Contributor II

Same issue here; I can't solve it.

 

alifaily
New Contributor II

Same issue here.


 

Wellington8962
New Contributor II

Hello, I am facing the same issue while trying to import a notebook into Databricks Community Edition. The error message is:
'Import failed with error: Could not deserialize: (No such file or directory)'.

I have already tried re-exporting and re-importing the notebook, but the issue persists. The file size is small (7.9 KB).

Hi,

I found a workaround:

Step 1: If you are using Azure or AWS, create a Databricks workspace instance.

Step 2: Once the workspace is ready, import your .dbc (Databricks Archive) file.

Step 3: This will show all the files within the .dbc.

Step 4: Export your notebooks as HTML so that you can easily view them in a browser.

Step 5: Delete the resource group on Azure or AWS and remove all instances with the provider so that you will not get charged.

Hope this is helpful.
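
For Step 4, the HTML export can also be scripted instead of clicking through the UI. Here is a minimal sketch, assuming a full (non-Community) workspace URL and a personal access token; the placeholder values and notebook path are hypothetical.

import requests

WORKSPACE_URL = "https://<your-workspace>.cloud.databricks.com"  # hypothetical workspace URL
TOKEN = "<personal-access-token>"                                 # hypothetical access token

resp = requests.get(
    f"{WORKSPACE_URL}/api/2.0/workspace/export",
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={
        "path": "/Users/me@example.com/my_notebook",  # hypothetical path of a notebook from the .dbc
        "format": "HTML",                              # browser-viewable export
        "direct_download": "true",                     # return the file itself rather than base64 JSON
    },
)
resp.raise_for_status()
with open("my_notebook.html", "wb") as f:
    f.write(resp.content)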

Walter_C
Databricks Employee

Our engineering team has confirmed that file import is not supported in Community Edition; this is expected behavior.

MJ_BE8
New Contributor III

So importing notebooks/code is no longer supported in the Community Edition? Do users have to Ctrl+C/Ctrl+V the code manually?

Walter_C
Databricks Employee

Unfortunately, this appears to be the current behavior.

mskulkarni1610
New Contributor II

Importing a .dbc file is not working in Community Edition. Please advise on how we can resolve this issue. Thanks.

 
