01-17-2022 02:57 AM
Hi Folks,
I have installed and configured the Databricks CLI on my local machine. I tried to copy a file from my personal computer to a dbfs:/ path using dbfs cp. The copy appears to succeed, but the file is only visible locally; I cannot see it in DBFS via the Databricks UI.
When I try to list the data from a cluster, the file is not there.
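For reference, the copy attempt described above might look like the following sketch (the local filename and DBFS target path here are hypothetical, not taken from the original post):

```shell
# Hypothetical example: copy a local file to the DBFS root
dbfs cp ./test.csv dbfs:/test.csv

# Then check from the CLI whether the file actually landed in DBFS
dbfs ls dbfs:/
```

If dbfs ls does not show the file either, the problem is on the upload side rather than in the UI.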
01-17-2022 03:18 AM
The CLI command to copy from local storage to DBFS is "databricks fs cp".
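A minimal sketch of that command (the filename is hypothetical):

```shell
# Copy a local file to DBFS with the full CLI command
databricks fs cp ./test.csv dbfs:/test.csv

# Replace an existing file at the destination, if needed
databricks fs cp --overwrite ./test.csv dbfs:/test.csv
```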
01-17-2022 08:57 PM
I tried databricks fs cp too, @Hubert Dudek, but the issue still exists. Any idea why?
01-18-2022 11:50 AM
Which distribution are you using (Community, Azure)?
Can you verify your connection using databricks fs ls?
Do you see the typical folders there, like user, tmp, etc.?
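A sketch of that verification step (the folder names in the comment are what a typical workspace shows, not output from this user's workspace):

```shell
# List the DBFS root to confirm the CLI is connected to the expected workspace
databricks fs ls dbfs:/
# A typical workspace shows folders such as FileStore, tmp, and user
```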
01-23-2022 06:19 AM
Hi @Hubert Dudek, I am using Community Edition with AWS in the backend.
If I run databricks fs ls, I can see only one folder: databricks-results.
01-29-2022 06:12 AM
01-31-2022 05:36 AM
The issue still exists, @Kaniz Fatma.
02-07-2022 08:34 AM
No, @Kaniz. The syntax I was using is the same as you mentioned. I still cannot see the file I copied from local to DBFS in the Databricks front end.
03-08-2022 04:10 AM
Hi, could you try saving the file from your local machine to the dbfs:/FileStore location?
# Put local file test.py to dbfs:/FileStore/test.py
dbfs cp test.py dbfs:/FileStore/test.py
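If the copy to dbfs:/FileStore succeeds, the file should then be listable from the CLI; a sketch, reusing the test.py filename from the example above:

```shell
# Confirm the upload from the CLI
dbfs ls dbfs:/FileStore

# From a notebook attached to a cluster, the same path is visible via dbutils:
#   dbutils.fs.ls("dbfs:/FileStore")
```

Files under dbfs:/FileStore are the ones surfaced in the workspace UI, which is why this location is worth trying first.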