2 weeks ago
I tried to import a notebook (.dbc) into my workspace today and this error popped up. I have never encountered it before. Is this a temporary bug, or has something changed in the Community Edition?
Error Message:
Import failed with error: Could not deserialize: /tmp/import-stage14449922761726312769/3267549799876087/7939851744215972/849c8978_9587_42cb_ab5d_8ddfb72d44aa-[my file name]__12_-92019.dbc (No such file or directory)
a week ago
The error suggests there might be a problem with the temporary storage used during the import process.
Here are a few steps you can take to troubleshoot and potentially resolve the issue:
Retry the Import: Sometimes, these issues can be transient. Try importing the notebook again after some time to see if the problem persists.
Export and Re-import: As a workaround, you can try exporting the notebook in a different format (e.g., as a Python file) and then re-importing it.
Check File Size: Ensure that the file size of the .dbc notebook is within the import limits. If the file is too large, you might encounter import errors.
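The file-size check suggested above can be done locally before uploading. Here is a minimal sketch; note that the 10 MB limit used here is purely an assumption for illustration, since the actual import limit is not stated in this thread:

```python
import os

# Illustrative limit only -- the real workspace import limit is an
# assumption here, not a documented Databricks value.
MAX_IMPORT_BYTES = 10 * 1024 * 1024

def within_import_limit(path, limit=MAX_IMPORT_BYTES):
    """Return (ok, size_in_bytes) for a candidate import file."""
    size = os.path.getsize(path)
    return size <= limit, size
```

Running this on the .dbc before uploading at least rules out size as the cause before you spend time on other workarounds.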
a week ago
This didn't work for me; I tried. My file size is 9 MB.
a week ago
Hello,
I am also facing the same issue. I even created a new account, but the issue persists.
I tried re-exporting and re-importing, but it did not work.
The file size is not very large; it's 1.5 MB.
a week ago
I am facing the same issue
a week ago
Same issue here; I cannot import an .ipynb file that I worked on in JupyterLab.
a week ago
Same issue here; I can't solve it.
a week ago
same issue here
a week ago
Hello, I am facing the same issue while trying to import a notebook into Databricks Community Edition. The error message is:
'Import failed with error: Could not deserialize: (No such file or directory)'.
I have already tried re-exporting and re-importing the notebook, but the issue persists. The file size is small (7.9 KB).
a week ago
Hi,
I found a workaround:
Step 1: If you are using Azure or AWS, create an instance of a Databricks workspace.
Step 2: Once the workspace is ready, import your .dbc (Databricks archive) file.
Step 3: This will show all the files within the .dbc.
Step 4: Export your notebooks as HTML so you can easily view them in a browser.
Step 5: Delete the resource group on Azure or AWS, and delete all instances with the provider, so that you will not get charged.
Hope this will be helpful
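If you don't have access to a paid Azure/AWS workspace, the .dbc archive can also be unpacked locally: a .dbc is typically a zip container of JSON notebook documents. The sketch below assumes the common layout (a "name" field plus a "commands" list whose entries hold code in a "command" string); the exact structure can vary by export version, so treat this as a best-effort recovery tool:

```python
import json
import os
import zipfile

def extract_dbc_sources(dbc_path, out_dir):
    """Extract notebook code cells from a .dbc archive into plain .py files.

    Assumes each notebook entry in the zip is a JSON document with a
    "name" field and a "commands" list, where each command stores its
    code in a "command" string. Non-JSON entries (e.g. manifests) are
    skipped.
    """
    os.makedirs(out_dir, exist_ok=True)
    extracted = []
    with zipfile.ZipFile(dbc_path) as zf:
        for entry in zf.namelist():
            if entry.endswith("/"):
                continue  # directory entry
            try:
                doc = json.loads(zf.read(entry))
            except ValueError:
                continue  # not a JSON notebook document
            commands = doc.get("commands")
            if not commands:
                continue
            cells = [c.get("command", "") for c in commands]
            out_path = os.path.join(out_dir, doc.get("name", "notebook") + ".py")
            with open(out_path, "w") as f:
                # Separate cells the way Databricks source exports do
                f.write("\n\n# COMMAND ----------\n\n".join(cells))
            extracted.append(out_path)
    return extracted
```

That at least recovers the code so it can be pasted or re-imported cell by cell, without spinning up a paid workspace.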
a week ago
Our engineering team has confirmed that file import is not allowed in Community Edition; this is expected behavior.
a week ago
So importing notebooks/code is no longer supported in the Community Edition? Do users have to copy and paste the code manually?
a week ago
Unfortunately, this appears to be the current behavior.
a week ago