Data Engineering

Was DBFS disabled from community edition?

JuanSeValencia
New Contributor

Hello
I'm trying to use the Upload data to DBFS... option in Databricks Community Edition, but it's disabled. I also tried to enable it via Settings > Advanced > Other, but the option is no longer in the list. Is this a temporary or permanent change in the Community Edition?

15 REPLIES

szymon_dybczak
Contributor III

Hi @JuanSeValencia ,

Can confirm. I can't find it either. 

Prasanth05
New Contributor

Yes, I can confirm that I cannot see DBFS and cannot see the option to enable the DBFS file browser. Strange.

staniopolis
New Contributor

However, you can access it using dbutils.fs or %fs commands
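For example, a minimal sketch of browsing DBFS from a notebook cell (the /FileStore path is just an illustration):

# List the contents of a DBFS folder with dbutils
display(dbutils.fs.ls('/FileStore/'))

# Or, equivalently, use the %fs magic in its own cell:
# %fs ls /FileStore/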

 

DineshReddyN
New Contributor

I can't find it either.

Here you are:

[screenshots attached]

 

But will it be possible to find the DBFS this way?

 

Chikke
New Contributor

Hello
I'm trying to use the Upload data to DBFS... option in Databricks Community Edition, but it's disabled. I also tried to enable it via Settings > Advanced > Other, but the option is no longer in the list. Is this a temporary or permanent change in the Community Edition?

llavanway
New Contributor

I am also having this problem as of today. I was able to upload to DBFS by clicking File --> Upload data to DBFS a few days ago. Now Upload data to DBFS is greyed out.

SJK
New Contributor

Same here. I found the files I had uploaded using the %fs ls <path> command, but I don't know how to upload more files. Nor are the toggle options available in Community Edition.

BREN
Visitor

Same here.

Manikandan
Visitor

I'm also having the same problem. Is there any other way to add files to DBFS?

adihc
Visitor

Looking for an answer on the above point as well.

Question for the person showing the notebook approach: could you provide documentation or a link explaining how to upload a file there?

I know how to access the files there; however, I'm not sure about the upload part.

gchandra
Databricks Employee

The workaround is to mount an S3 bucket and use that as your source.

 

access_key = ""
secret_key = ""
encoded_secret_key = secret_key.replace("/", "%2F")

 

aws_bucket_name = "yourawsbucketname/"
mount_name = "youraliasmountname"

 

# dbutils.fs.unmount(f"/mnt/{mount_name}")  # unmount first if it is already mounted
dbutils.fs.mount(f"s3a://{access_key}:{encoded_secret_key}@{aws_bucket_name}", f"/mnt/{mount_name}")

After mounting, go to the next cell and run %fs ls /mnt/youraliasmountname to verify the mount.
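Once the mount is visible, a minimal sketch of reading from it with Spark (the data.csv file name is hypothetical):

# Read a CSV from the mounted bucket; the file name is only an example
df = spark.read.csv(f"/mnt/{mount_name}/data.csv", header=True, inferSchema=True)
display(df)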



jhonCostaDev
Visitor

Hi guys, I have the same problem, but I found a solution. Sorry about my English, but I'll try to explain how to access the files:

# First, use dbutils to list the folder or files you need.
files = dbutils.fs.ls('/FileStore/')

# Second, copy each file from FileStore to the cluster's local disk.
for file in files:
    dbutils.fs.cp(file.path, f'file:/tmp/{file.name}')

That's it. Now you can access the files locally. Note that they will be deleted after the cluster is shut down.
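The same dbutils.fs.cp call also works in the opposite direction if you need to get a file into DBFS, assuming it is already on the driver's local disk (the file name below is hypothetical):

# Copy a file from the driver's local disk into DBFS; path and name are examples only
dbutils.fs.cp('file:/tmp/my_data.csv', 'dbfs:/FileStore/my_data.csv')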
