Thursday
I tried to import a notebook (.dbc) into my workspace today and this error popped up. I have never encountered it before. Is this a temporary bug, or has something changed in the Community Edition?
Error Message:
Import failed with error: Could not deserialize: /tmp/import-stage14449922761726312769/3267549799876087/7939851744215972/849c8978_9587_42cb_ab5d_8ddfb72d44aa-[my file name]__12_-92019.dbc (No such file or directory)
Friday
The error suggests a problem with the temporary storage used during the import process.
Here are a few steps you can take to troubleshoot and potentially resolve the issue:
Retry the Import: Sometimes, these issues can be transient. Try importing the notebook again after some time to see if the problem persists.
Export and Re-import: As a workaround, you can try exporting the notebook in a different format (e.g., as a Python file) and then re-importing it.
Check File Size: Ensure that the file size of the .dbc notebook is within the import limits. If the file is too large, you might encounter import errors.
Friday
This didn't work for me; I tried. My file size is 9 MB.
Friday
Hello,
I am also facing the same issue. I even created a new account, but the issue persists.
I tried re-exporting and re-importing, but it did not work.
The file is not very large; it's 1.5 MB.
Friday
I am facing the same issue
Saturday
Same issue here; I cannot import an .ipynb file that I worked on in JupyterLab.
Saturday
Same issue here; I can't solve it.
Saturday
same issue here
Sunday
Hello, I am facing the same issue while trying to import a notebook into Databricks Community Edition. The error message is:
'Import failed with error: Could not deserialize: (No such file or directory)'.
I have already tried re-exporting and re-importing the notebook, but the issue persists. The file size is small (7.9 KB).
Sunday
Hi,
I found a workaround:
Step 1: If you are using Azure or AWS, create a Databricks workspace instance.
Step 2: Once the workspace is ready, import your .dbc (Databricks archive) file.
Step 3: This will show all the files within the .dbc.
Step 4: Export your notebooks as HTML so you can easily view them in a browser.
Step 5: Delete the resource group on Azure or AWS, and delete all instances with the provider, so that you will not get charged.
Hope this is helpful.
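If you prefer scripting over the UI, steps 2 and 4 can also be driven through the Databricks Workspace REST API (`/api/2.0/workspace/import`). This is a minimal sketch, not an official recipe; it assumes a paid Azure/AWS workspace, and the host URL, token, and target path are placeholders you must fill in:

```python
import base64


def build_import_payload(workspace_path: str, dbc_bytes: bytes) -> dict:
    """Build the JSON body for POST /api/2.0/workspace/import with a .dbc file."""
    return {
        "path": workspace_path,          # e.g. "/Users/you@example.com/imported"
        "format": "DBC",
        "content": base64.b64encode(dbc_bytes).decode("ascii"),
        "overwrite": True,
    }


def import_dbc(host: str, token: str, workspace_path: str, local_file: str) -> None:
    """Upload a local .dbc archive into the workspace (requires `requests`)."""
    import requests

    with open(local_file, "rb") as f:
        payload = build_import_payload(workspace_path, f.read())
    resp = requests.post(
        f"{host}/api/2.0/workspace/import",
        headers={"Authorization": f"Bearer {token}"},
        json=payload,
    )
    resp.raise_for_status()
```

Note that this needs a personal access token, which Community Edition does not provide, so it only helps on a regular workspace.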
Sunday
Our engineering team has confirmed that this is expected: file import is not allowed in Community Edition.
Sunday
So importing notebooks/code is no longer supported in the Community Edition? Do users have to Ctrl+C/Ctrl+V the code manually?
Sunday
Unfortunately this seems to be the actual behavior as of now.
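If the goal is just to recover the code so it can be pasted into Community Edition, it may help that a .dbc archive is an ordinary zip file whose entries are JSON notebook documents. The sketch below extracts the cell sources locally; the `"commands"`/`"command"` key names are assumptions based on archives I have inspected, so check them against your own file:

```python
import json
import zipfile


def extract_dbc_sources(dbc_path: str) -> dict:
    """Extract notebook cell sources from a .dbc archive (a zip of JSON files).

    Returns {entry_name: [cell_source, ...]}. Assumes each notebook JSON
    stores its cells under a "commands" list with a "command" field.
    """
    sources = {}
    with zipfile.ZipFile(dbc_path) as zf:
        for name in zf.namelist():
            if name.endswith("/"):
                continue  # skip directory entries
            try:
                doc = json.loads(zf.read(name))
            except ValueError:
                continue  # not a JSON notebook entry
            cells = [c.get("command", "") for c in doc.get("commands", [])]
            if cells:
                sources[name] = cells
    return sources
```

Printing the returned sources gives plain text you can paste cell by cell, which at least avoids re-typing anything.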
Monday