
Suddenly can't find the option to upload files into Databricks Community Edition

Younevano
New Contributor III

Hi everyone,

I am suddenly unable to find the option to upload my files into Databricks Community Edition today; please see the attached screenshot. Is anyone else facing this issue?

12 REPLIES

Vamsi_B
New Contributor II

I was about to post the same question; for some reason, we are not able to find the DBFS toggle option in Settings either!

Alexisqc92
New Contributor II

I'm facing the same issue. Were you able to find a solution or a workaround?

gchandra
Databricks Employee

The workaround is to mount an S3 bucket and use that as your source.




Younevano
New Contributor III

Many thanks for this, gchandra.

Will Databricks bring back the file upload option in Community Edition in the near future? I have never mounted an S3 bucket, so I will have to test it and then teach all my students about S3 buckets, which is not feasible in the middle of the course. Could you please provide easy steps to mount an S3 bucket without incurring any charges?
Thank you, Sandra

gchandra
Databricks Employee
access_key = ""
secret_key = ""
encoded_secret_key = secret_key.replace("/", "%2F")

aws_bucket_name = "yourawsbucketname/"
mount_name = "youraliasmountname"

# #dbutils.fs.unmountf"/mnt/{mount_name}")
dbutils.fs.mount(f"s3a://{access_key}:{encoded_secret_key}@{aws_bucket_name}", f"/mnt/{mount_name}")

After mounting, list the contents in the next cell:

%fs ls /mnt/youraliasmountname
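Once the mount works, the bucket behaves like any other DBFS path. A minimal sketch of reading from it with Spark (students.csv is a hypothetical file name; substitute your own):

# Read a CSV from the mounted bucket into a DataFrame
df = spark.read.csv("/mnt/youraliasmountname/students.csv", header=True, inferSchema=True)
display(df)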




Younevano
New Contributor III

What about the IAM configuration and all that? Is the DBFS file upload permanently disabled for Community Edition?

thorffin
New Contributor II

If anyone has found a solution, please post it here.

mahdi
New Contributor II

Perfect time for this to happen: right in the middle of a class assignment due in a few days. 🙂
The easiest way to do it is with dbutils.fs.mv, moving the file from the driver's local filesystem into DBFS. Here is a complete example:

## Create a file in the driver's local filesystem
with open("./test_file.txt", "w") as file:
    lines = "\n".join(map(str, range(5)))
    file.write(lines)

!cat test_file.txt
>> 0
>> 1
>> 2
>> 3
>> 4

# Move the file from the local filesystem to DBFS
dbutils.fs.mv("file:/databricks/driver/test_file.txt", "dbfs:/test_file.txt")
>> Out[10]: True

# Read the file back as an RDD
file_path = "dbfs:/test_file.txt"
rdd = sc.textFile(file_path)
rdd.collect()
>> Out[11]: ['0', '1', '2', '3', '4']
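The remaining step is getting your own file onto the driver in the first place. A minimal sketch, assuming your file is reachable over HTTP (the URL and file names below are hypothetical):

import urllib.request

# Download to the driver's local filesystem (hypothetical URL and file name)
urllib.request.urlretrieve("https://example.com/data.csv", "/databricks/driver/data.csv")

# Move it into DBFS so Spark can read it
dbutils.fs.mv("file:/databricks/driver/data.csv", "dbfs:/data.csv")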

 

gchandra
Databricks Employee

 

The CE product roadmap is not public, so I cannot comment on that.

I have provided two alternative options. AWS offers a Free Tier (https://aws.amazon.com/free/) for up to one year, and students can benefit from that.

https://community.databricks.com/t5/data-engineering/databricks-community-edition-dbfs-alternative-s...

Let me know if you have any further questions.




gchandra
Databricks Employee

It's fixed. You can continue to use Upload.




Younevano
New Contributor III

Thanks gchandra, I hope it is going to stay there for a long time! Thanks, Sandra
