
Was DBFS disabled from community edition?

JuanSeValencia
New Contributor III

Hello
I'm trying to use the "Upload data to DBFS..." option in Databricks Community Edition, but it's disabled. I also tried to enable it via Settings > Advanced > Other, but the option is no longer in the list. Is this a temporary or permanent change in Community Edition?

1 ACCEPTED SOLUTION

Shanku_Niyogi
Contributor

Hey folks - sorry, this was due to an erroneous rollout, it's back now. Sorry about the disruption. Thanks for continuing to use Community Edition!


18 REPLIES

szymon_dybczak
Contributor III

Hi @JuanSeValencia ,

Can confirm. I can't find it either. 

Prasanth05
New Contributor III

Yes, I can confirm that I cannot see DBFS, and the option to enable the DBFS file browser is gone. Strange...

staniopolis
New Contributor III

However, you can access it using dbutils.fs or %fs commands.
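
For example (a minimal sketch; /FileStore/ is the default DBFS upload location, so adjust the path to your own folders):

# List DBFS contents from a Python cell
display(dbutils.fs.ls("/FileStore/"))

# Or use the %fs magic in a cell of its own:
# %fs ls /FileStore/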

 

DineshReddyN
New Contributor II

I can't find it either.

Here you are:

[screenshots attached by staniopolis]

 

Chikke
New Contributor III

But will it be possible to browse DBFS this way?

 


llavanway
New Contributor II

I am also having this problem as of today. A few days ago I could upload by clicking File --> Upload data to DBFS; now "Upload data to DBFS" is greyed out.

SJK
New Contributor II

Same here. I found the files I had uploaded using the %fs ls <path> command, but I don't know how to upload more files, and the toggleable options aren't available in Community Edition either.

BREN
New Contributor II

Same here.

Manikandan
New Contributor II

I'm also having the same problem. Is there any other way to add files to DBFS?

adihc
New Contributor II

Looking for an answer on the above point as well.

A question for the person showing the notebook approach: could you share documentation or a link explaining how to upload a file there?

I know how to access the files; I'm just not sure about the upload part.

gchandra
Databricks Employee

The workaround is to mount an S3 bucket and use that as your source.

 

access_key = ""
secret_key = ""
encoded_secret_key = secret_key.replace("/", "%2F")

aws_bucket_name = "yourawsbucketname/"
mount_name = "youraliasmountname"

# Unmount first if the mount already exists:
# dbutils.fs.unmount(f"/mnt/{mount_name}")
dbutils.fs.mount(f"s3a://{access_key}:{encoded_secret_key}@{aws_bucket_name}", f"/mnt/{mount_name}")

After mounting, go to the next cell and run:

%fs ls /mnt/youraliasmountname
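
Once mounted, paths under /mnt/ behave like any other DBFS location. A small illustrative sketch (the sample.csv file name is hypothetical):

# Read a file from the mounted bucket with Spark
df = spark.read.csv(f"/mnt/{mount_name}/sample.csv", header=True)
display(df)

# Or copy it into /FileStore/ to sit alongside earlier uploads
dbutils.fs.cp(f"/mnt/{mount_name}/sample.csv", "/FileStore/sample.csv")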



jhonCostaDev
New Contributor II

Hi guys, I had the same problem, but I found a solution. Sorry about my English, but I'll try to explain how to access the files:

# First, use dbutils to list the folder or files in DBFS.
files = dbutils.fs.ls('/FileStore/')

# Second, copy each file from FileStore to the cluster's local filesystem.
for file in files:
    dbutils.fs.cp(file.path, f'file:/tmp/{file.name}')

It's done. Now you can access the files. Note that they will be deleted once the cluster shuts down.
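
To confirm the copy worked, you can list the driver's local /tmp directory with standard Python (a minimal check, assuming the loop above ran in the same cluster session):

import os

# Files copied to file:/tmp/... land on the driver's local disk
print(os.listdir('/tmp'))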
