Not able to move files from local to dbfs through dbfs CLI

study_community
New Contributor III

Hi Folks,

I have installed and configured the Databricks CLI on my local machine. I tried to copy a file from my personal computer to a dbfs:/ path using dbfs cp. The command appears to succeed, but the file remains visible only on my local machine; I cannot see it in DBFS via the Databricks UI.
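For context, my CLI setup followed the standard token flow (the host and token below are placeholders, not my real values):

```shell
# Configure the Databricks CLI with a workspace host and personal access token
# (both values here are placeholders)
databricks configure --token
# Databricks Host (should begin with https://): https://community.cloud.databricks.com
# Token: dapiXXXXXXXXXXXXXXXX
```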

[image: local file listing]

I'm trying to list the data in the cluster, but it is not available there.

[image: DBFS listing]
1 ACCEPTED SOLUTION


Hi @Thulasitharan Govindaraj​ ,

The DBFS command-line interface (CLI) uses the DBFS API to expose an easy-to-use command-line interface to DBFS. Using this client, you can interact with DBFS using commands similar to those you use on a Unix command line. For example:

# List files in DBFS
dbfs ls
# Put local file ./apple.txt to dbfs:/apple.txt
dbfs cp ./apple.txt dbfs:/apple.txt
# Get dbfs:/apple.txt and save to local file ./apple.txt
dbfs cp dbfs:/apple.txt ./apple.txt
# Recursively put local dir ./banana to dbfs:/banana
dbfs cp -r ./banana dbfs:/banana

Reference


13 REPLIES

Hubert-Dudek
Esteemed Contributor III

For the CLI, the command is "databricks fs cp" to copy from local storage to DBFS.
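A minimal sketch of that form (apple.txt is just an example file name):

```shell
# Copy a local file to DBFS via the "databricks fs" command group
databricks fs cp ./apple.txt dbfs:/apple.txt
# List the target path to confirm the copy landed
databricks fs ls dbfs:/
```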


study_community
New Contributor III

I tried databricks fs cp too, @Hubert Dudek​, but the issue still exists. Any idea why?

Hubert-Dudek
Esteemed Contributor III

What distribution are you using (Community, Azure)?

Can you verify your connection using databricks fs ls?

Do you see the typical folders there, like user, tmp, etc.?
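On a full (non-Community) workspace, a quick connection check might look like this; the folders listed in the comment are typical, not guaranteed:

```shell
# List the DBFS root; a working connection returns the top-level folders
databricks fs ls dbfs:/
# Typical folders on a standard workspace:
# FileStore  databricks-datasets  databricks-results  tmp  user
```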


Hi @Hubert Dudek​  , I am using community edition with AWS in the backend.

If I do databricks fs ls, I am only able to see one folder: databricks-results.


Okay, sure. I'll use the upload option then.

Hi @Thulasitharan Govindaraj​ , Does the issue persist or is it resolved?

The issue still exists, @Kaniz Fatma​.


Hi @Thulasitharan Govindaraj​ , Were you able to resolve your problem through the solution I provided above?

No, @Kaniz. The syntax I used is the same as you mentioned. I still cannot see the file I copied from local to DBFS in the Databricks front end.

Hi @Thulasitharan Govindaraj​ , Did you install and configure the Azure Databricks CLI properly? Would you mind trying again?

Hi @Thulasitharan Govindaraj​ , What is the size of the file you're trying to copy from local to DBFS?

Anonymous
Not applicable

Hi, could you try saving the file from your local machine to the dbfs:/FileStore location?

# Put local file test.py to dbfs:/FileStore/test.py

dbfs cp test.py dbfs:/FileStore/test.py
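If the copy succeeds, a quick round trip to confirm might look like this (test.py is just an example name; per the Databricks docs, files under dbfs:/FileStore are also served at the workspace's /files/ path):

```shell
# Copy the local file into FileStore, then list it to confirm
dbfs cp test.py dbfs:/FileStore/test.py
dbfs ls dbfs:/FileStore
# The file should then also be visible in the workspace UI,
# and downloadable at https://<workspace-url>/files/test.py
```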
