Databricks Free Trial Help
Engage in discussions about the Databricks Free Trial within the Databricks Community. Share insights, tips, and best practices for getting started, troubleshooting issues, and maximizing the value of your trial experience to explore Databricks' capabilities effectively.

Upload dbc file from another workspace

RashidQuamarCog
New Contributor

Hi team,

I have a .dbc file from a different workspace. When I try to upload it into another workspace, it says the folder does not exist.

Please suggest a solution.

 

9 REPLIES

Walter_C
Databricks Employee

How are you running this process? You could export the file and import it into the new workspace.
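A minimal sketch of that export/import route using the Databricks Workspace REST API (`/api/2.0/workspace/import`, with `/api/2.0/workspace/mkdirs` first, since the original error complains that the folder does not exist). The host, token, and paths below are placeholders I am assuming, not values from this thread:

```python
# Sketch: import a .dbc archive into a workspace via the Workspace REST API.
# HOST, TOKEN, and all paths are placeholders -- substitute your own values.
import base64
import json
import urllib.request

HOST = "https://<your-workspace>.cloud.databricks.com"  # placeholder
TOKEN = "<personal-access-token>"                       # placeholder


def build_import_payload(target_path: str, dbc_bytes: bytes) -> dict:
    """Body for POST /api/2.0/workspace/import with a .dbc archive."""
    return {
        "path": target_path,
        "format": "DBC",  # Databricks archive format
        "content": base64.b64encode(dbc_bytes).decode("ascii"),
        "overwrite": False,
    }


def post(endpoint: str, payload: dict) -> dict:
    """POST a JSON payload to /api/2.0/workspace/<endpoint>."""
    req = urllib.request.Request(
        f"{HOST}/api/2.0/workspace/{endpoint}",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {TOKEN}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())


if __name__ == "__main__":
    target = "/Users/<you>/imported"  # placeholder workspace folder
    # Create the target folder first, so import does not fail on a
    # missing parent directory.
    post("mkdirs", {"path": target})
    with open("archive.dbc", "rb") as f:  # placeholder local file
        post("import", build_import_payload(target, f.read()))
```

The same operations are available through the Databricks CLI's `workspace` commands if you prefer not to call the API directly.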

RashidQuamarCog
New Contributor

Hi Walter, thanks for the quick response

This is an export I did long ago from my earlier workspace. Now, when I import it into my new workspace using Databricks Community Edition, it throws the error: No such file or directory.

Please let me know if any other details are required.

RashidQuamarCog
New Contributor

Import failed with error: Could not deserialize: /tmp/import-stage2420831599065373102/2336626257607783/1306899984782095/6f8b0e51_803e_4dcc_aac1_4182719f737d-quamar_rashid_wipro_com-9b826.dbc (No such file or directory)

RashidQuamarCog
New Contributor

file size is 9.3MB

TakuyaOmi
Contributor II

A similar error occurs for me as well. I even tried importing a very small 5 KB file, but it didn't work. This doesn't seem to be an issue related to file size limitations.

Walter_C
Databricks Employee

Can you confirm whether, like @TakuyaOmi, you also have issues importing any other file? It can be a CSV or any other file type.

TakuyaOmi
Contributor II
I tested all supported file formats, and the same error occurred with each one.

Walter_C
Databricks Employee

I created a Community Edition environment for testing and I am seeing the same behavior. I will look at this internally.

 

RashidQuamarCog
New Contributor

Hi,

I found a workaround:

Step 1: If you are using Azure or AWS, create a Databricks workspace instance.

Step 2: Once the workspace is ready, import your .dbc (Databricks Archive) file.

Step 3: This will show all the files within the .dbc.

Step 4: Export your notebooks as HTML so that you can easily view them in a browser.

Step 5: Delete the resource group on Azure, or delete all instances on your provider, so that you will not get charged.

Hope this is helpful.
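Step 4 of the workaround can be scripted against the Workspace REST API: `GET /api/2.0/workspace/export` with `format=HTML` returns the notebook as base64-encoded HTML. A hedged sketch, where the host, token, and notebook path are placeholders I am assuming, not values from this thread:

```python
# Sketch: download a workspace notebook as HTML via the Workspace REST API.
# HOST, TOKEN, and the notebook path are placeholders -- substitute your own.
import base64
import json
import urllib.parse
import urllib.request

HOST = "https://<your-workspace>.azuredatabricks.net"  # placeholder
TOKEN = "<personal-access-token>"                      # placeholder


def export_url(notebook_path: str) -> str:
    """URL for GET /api/2.0/workspace/export in HTML format."""
    query = urllib.parse.urlencode({"path": notebook_path, "format": "HTML"})
    return f"{HOST}/api/2.0/workspace/export?{query}"


def export_html(notebook_path: str, out_file: str) -> None:
    """Fetch one notebook as HTML and write it to a local file."""
    req = urllib.request.Request(
        export_url(notebook_path),
        headers={"Authorization": f"Bearer {TOKEN}"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.loads(resp.read())
    with open(out_file, "wb") as f:
        # The API returns the HTML base64-encoded in the "content" field.
        f.write(base64.b64decode(body["content"]))


if __name__ == "__main__":
    export_html("/Users/<you>/imported/my_notebook", "my_notebook.html")
```

Looping this over the notebooks imported in Step 2 lets you save everything as HTML before tearing the temporary workspace down in Step 5.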
