12-12-2024 08:24 PM
I tried to import a notebook (.dbc) into my workspace today and this error popped up. I have never encountered it before. Is this a temporary bug, or has something changed for the Community Edition?
Error Message:
Import failed with error: Could not deserialize: /tmp/import-stage14449922761726312769/3267549799876087/7939851744215972/849c8978_9587_42cb_ab5d_8ddfb72d44aa-[my file name]__12_-92019.dbc (No such file or directory)
12-13-2024 03:38 AM
The error suggests a problem with the temporary storage used during the import process.
Here are a few steps you can take to troubleshoot and potentially resolve the issue:
Retry the Import: Sometimes, these issues can be transient. Try importing the notebook again after some time to see if the problem persists.
Export and Re-import: As a workaround, you can try exporting the notebook in a different format (e.g., as a Python source file) and then re-importing it; a scripted sketch of this follows the list.
Check File Size: Ensure that the file size of the .dbc notebook is within the import limits. If the file is too large, you might encounter import errors.
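If the browser upload keeps failing, and you have access to a workspace where imports are allowed, you could also try importing over the REST Workspace API instead of the UI. Here is a minimal sketch, assuming the standard Workspace API 2.0 import endpoint and a personal access token; the host, token, file, and workspace path are placeholders to replace with your own (note that token-based API access may not be available on Community Edition):

```python
import base64
import requests

# Placeholders -- substitute your own workspace URL and personal access token.
HOST = "https://<your-workspace>.cloud.databricks.com"
TOKEN = "dapi..."

def import_notebook(local_file: str, workspace_path: str) -> None:
    """Import a local Python source file as a notebook via the Workspace API 2.0."""
    with open(local_file, "rb") as f:
        # The API expects the file content base64-encoded.
        content = base64.b64encode(f.read()).decode("ascii")
    resp = requests.post(
        f"{HOST}/api/2.0/workspace/import",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json={
            "path": workspace_path,
            "format": "SOURCE",   # import as source code rather than a .dbc archive
            "language": "PYTHON",
            "content": content,
            "overwrite": True,
        },
        timeout=30,
    )
    resp.raise_for_status()

import_notebook("my_notebook.py", "/Users/me@example.com/my_notebook")
```

Importing as SOURCE also sidesteps the .dbc deserialization path that the error message points at.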
12-13-2024 08:01 AM
This didn't work for me; I tried. My file size is 9 MB.
12-13-2024 07:33 AM
Hello,
I am also facing the same issue. I even created a new account, but the issue still persists.
I tried re-exporting and re-importing, but it did not work.
The file size is not very large; it's 1.5 MB.
12-13-2024 07:57 AM
I am facing the same issue.
12-14-2024 03:27 AM
Same issue here; I cannot import an .ipynb file that I worked on in JupyterLab.
12-14-2024 05:01 AM
Same issue here; I can't solve it.
12-14-2024 11:26 AM
Same issue here.
12-15-2024 02:43 AM
Hello, I am facing the same issue while trying to import a notebook into Databricks Community Edition. The error message is:
'Import failed with error: Could not deserialize: (No such file or directory)'.
I have already tried re-exporting and re-importing the notebook, but the issue persists. The file size is small (7.9 KB).
12-15-2024 06:05 AM
Hi,
I found a workaround:
Step 1: If you are using Azure or AWS, create a Databricks workspace instance.
Step 2: Once the workspace is ready, import your .dbc (Databricks archive) file.
Step 3: This will show all the files within the .dbc.
Step 4: Export your notebooks as HTML so that you can easily view them in a browser (a scripted sketch follows these steps).
Step 5: Delete the resource group on Azure or AWS and delete all instances on the provider so that you will not get charged.
Hope this is helpful.
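For Step 4, rather than exporting each notebook by hand from the UI, you could script the export against the temporary workspace. A minimal sketch, assuming the Workspace API 2.0 export endpoint and a personal access token; the host, token, and notebook path are placeholders:

```python
import base64
import requests

# Placeholders -- substitute the temporary workspace's URL and an access token.
HOST = "https://adb-1234567890123456.7.azuredatabricks.net"
TOKEN = "dapi..."

def export_notebook_as_html(workspace_path: str, out_file: str) -> None:
    """Export a workspace notebook as HTML via the Workspace API 2.0."""
    resp = requests.get(
        f"{HOST}/api/2.0/workspace/export",
        headers={"Authorization": f"Bearer {TOKEN}"},
        params={"path": workspace_path, "format": "HTML"},
        timeout=30,
    )
    resp.raise_for_status()
    # The endpoint returns the exported file base64-encoded in the "content" field.
    with open(out_file, "wb") as f:
        f.write(base64.b64decode(resp.json()["content"]))

export_notebook_as_html("/Users/me@example.com/my_notebook", "my_notebook.html")
```

Looping this over your notebook paths lets you save everything locally before tearing the workspace down in Step 5.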
12-15-2024 07:03 PM
Our engineering team has confirmed that this is expected: file import is not allowed in Community Edition.
12-15-2024 07:13 PM
So importing notebooks/code is no longer supported in the Community Edition? Do users have to Ctrl+C/Ctrl+V code manually?
12-15-2024 07:20 PM
Unfortunately, this seems to be the behavior as of now.