Friday
Hi team,
I have a DBC file exported from a different workspace. When I try to upload that DBC into another workspace, it says the folder does not exist.
Please suggest a solution.
Friday
How are you running this process? You could export the file and import it into the new workspace.
Friday
Hi Walter, thanks for the quick response
This is an export I did long ago from my earlier workspace. Now when I import it into my new workspace using Databricks Community Edition, it throws the error: No such file or directory.
Please let me know if any other details are required.
Friday
Import failed with error: Could not deserialize: /tmp/import-stage2420831599065373102/2336626257607783/1306899984782095/6f8b0e51_803e_4dcc_aac1_4182719f737d-quamar_rashid_wipro_com-9b826.dbc (No such file or directory)
Friday
The file size is 9.3 MB.
Friday
A similar error occurs for me as well. I even tried importing a very small 5 KB file, but it didn't work. This doesn't seem to be an issue related to file size limitations.
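Since the error is "Could not deserialize" rather than a size limit, it may be worth ruling out a corrupted export before retrying. A DBC archive is internally a zip container, so a quick local check can tell whether the file itself is still readable. This is a minimal sketch; the path in the example is a placeholder, not one from this thread:

```python
import zipfile

def dbc_looks_valid(path):
    """Return True if the file is a readable zip container.

    DBC (Databricks Archive) files are zip archives internally,
    so a corrupted or truncated export will fail this check
    before you even attempt the workspace import.
    """
    if not zipfile.is_zipfile(path):
        return False
    try:
        with zipfile.ZipFile(path) as zf:
            # testzip() returns the name of the first bad member,
            # or None if every member's CRC checks out.
            return zf.testzip() is None
    except zipfile.BadZipFile:
        return False

# Example with a placeholder path:
# print(dbc_looks_valid("my_export.dbc"))
```

If this returns False, the export itself is damaged and no workspace will be able to import it.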
Friday
Can you confirm, @TakuyaOmi, whether you also have issues importing any other file type, such as a CSV?
Friday
I created a Community Edition environment for testing and I am seeing the same behavior. I will look at this internally.
Sunday
Hi,
I found a workaround:
Step 1: If you are using Azure or AWS, create a Databricks workspace instance.
Step 2: Once the workspace is ready, import your DBC (Databricks Archive) file.
Step 3: This will show all the files within the DBC.
Step 4: Export your notebooks as HTML so that you can easily view them in a browser.
Step 5: Delete the resource group on Azure, or all instances on AWS, so that you will not get charged.
Hope this is helpful.
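The import step above can also be done without the UI, via the Workspace API's import endpoint (POST /api/2.0/workspace/import), which takes the DBC content base64-encoded. Below is a sketch that only builds the request body; the host, token, and target path in the commented usage are placeholders you would substitute for your own workspace:

```python
import base64

def build_dbc_import_request(dbc_path, target_path, overwrite=False):
    """Build the JSON body for the Workspace API import endpoint
    (POST /api/2.0/workspace/import).

    The archive content must be sent base64-encoded, with
    format set to "DBC".
    """
    with open(dbc_path, "rb") as f:
        content = base64.b64encode(f.read()).decode("ascii")
    return {
        "path": target_path,   # e.g. "/Users/you@example.com/imported"
        "format": "DBC",
        "content": content,
        "overwrite": overwrite,
    }

# Sending it is a single authenticated POST (host and token are
# placeholders):
#
# import json, urllib.request
# body = json.dumps(
#     build_dbc_import_request("export.dbc", "/Users/me/imported")
# ).encode()
# req = urllib.request.Request(
#     "https://<databricks-host>/api/2.0/workspace/import",
#     data=body,
#     headers={"Authorization": "Bearer <token>",
#              "Content-Type": "application/json"},
# )
# urllib.request.urlopen(req)
```

Building the body separately from sending it makes the failure mode easier to see: if the archive cannot even be read and encoded locally, the problem is the file, not the workspace.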