12-13-2024 07:52 AM
Hi team,
I have a .dbc file from a different workspace. When I try to upload that .dbc into another workspace, it says the folder does not exist.
Please suggest a solution.
12-13-2024 08:06 AM
How are you running this process? You could export the file and import it into the new workspace.
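The export/import suggested above can also be done programmatically. A minimal sketch, assuming a reachable workspace and a personal access token (HOST, TOKEN, and both paths below are placeholders, not values from this thread), using the Workspace REST API's import endpoint, which expects the archive base64-encoded:

```python
# Hedged sketch: upload a local .dbc archive into a workspace via
# POST /api/2.0/workspace/import. All credentials/paths are placeholders.
import base64
import json
import urllib.request


def encode_file(path: str) -> str:
    """Base64-encode a local file for the API's 'content' field."""
    with open(path, "rb") as f:
        return base64.b64encode(f.read()).decode("ascii")


def import_dbc(host: str, token: str, local_dbc: str, target_path: str) -> None:
    """Import a .dbc archive into the workspace at target_path."""
    body = json.dumps({
        "path": target_path,
        "format": "DBC",  # Databricks archive format
        "content": encode_file(local_dbc),
    }).encode()
    req = urllib.request.Request(
        f"{host}/api/2.0/workspace/import",
        data=body,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )
    urllib.request.urlopen(req).close()
```

Going through the API directly can also surface a clearer error message than the UI shows.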
12-13-2024 08:22 AM
Hi Walter, thanks for the quick response.
This is an export I did long ago from my earlier workspace. Now, when I import it into my new workspace using Databricks Community Edition, it throws an error: No such file or directory.
Please let me know if any other details are required.
12-13-2024 08:27 AM
Import failed with error: Could not deserialize: /tmp/import-stage2420831599065373102/2336626257607783/1306899984782095/6f8b0e51_803e_4dcc_aac1_4182719f737d-quamar_rashid_wipro_com-9b826.dbc (No such file or directory)
12-13-2024 09:09 AM
The file size is 9.3 MB.
12-13-2024 09:38 AM
A similar error occurs for me as well. I even tried importing a very small 5 KB file, but it didn't work. This doesn't seem to be an issue related to file size limitations.
12-13-2024 09:43 AM
Can you confirm whether, like @Takuya-Omi, you also have issues importing any other file? It can be a CSV or any other file type.
12-13-2024 09:58 AM
12-13-2024 12:47 PM
I created a Community Edition environment for testing and I am seeing the same behavior. I will look at this internally.
12-15-2024 06:07 AM
Hi,
I found a workaround:
Step 1: If you are using Azure or AWS, create a Databricks workspace instance.
Step 2: Once the workspace is ready, import your .dbc (Databricks archive) file.
Step 3: This will show all the files within the .dbc.
Step 4: Export your notebooks as HTML so that you can easily view them in a browser.
Step 5: Delete the resource group on Azure, or delete all instances on your provider, so that you will not get charged.
Hope this is helpful.
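The HTML export step above can be scripted before tearing the workspace down. A minimal sketch, assuming a workspace URL and a personal access token (HOST, TOKEN, and the notebook path are placeholders, not values from this thread), using the Workspace REST API's export endpoint, which returns the file base64-encoded in a "content" field:

```python
# Hedged sketch: download one notebook as HTML via
# GET /api/2.0/workspace/export. All credentials/paths are placeholders.
import base64
import json
import urllib.parse
import urllib.request


def decode_content(b64_content: str) -> bytes:
    """Decode the base64 'content' field returned by workspace/export."""
    return base64.b64decode(b64_content)


def export_notebook_html(host: str, token: str, workspace_path: str,
                         out_file: str) -> None:
    """Save a workspace notebook as a local, browser-viewable HTML file."""
    query = urllib.parse.urlencode({"path": workspace_path, "format": "HTML"})
    req = urllib.request.Request(
        f"{host}/api/2.0/workspace/export?{query}",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        payload = json.loads(resp.read())
    with open(out_file, "wb") as f:
        f.write(decode_content(payload["content"]))
```

Running this over each notebook path before deleting the resource group leaves you with local HTML copies you can open in any browser.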