yesterday
I was about to post the same question; for some reason we are not able to find the DBFS toggle option in Settings either!
yesterday
I'm facing the same issue. Were you able to find a solution or a workaround?
yesterday
A workaround is to mount an S3 bucket and use that as your source.
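For anyone who hasn't mounted a bucket before, here is a minimal sketch of what that looks like in a Databricks notebook. The bucket name, mount point, and credentials below are placeholders, not values from this thread; in practice you would use an instance profile or `dbutils.secrets.get` rather than hard-coded keys. This only runs inside Databricks, where `dbutils` is provided by the runtime.

```python
# Placeholder credentials -- prefer an instance profile or
# dbutils.secrets.get(scope="...", key="...") in real use.
access_key = "<AWS_ACCESS_KEY_ID>"
secret_key = "<AWS_SECRET_ACCESS_KEY>"
bucket_name = "my-example-bucket"      # hypothetical bucket
mount_point = "/mnt/my-example-bucket" # hypothetical mount path

# Mount the bucket so it appears as a DBFS path.
dbutils.fs.mount(
    source=f"s3a://{bucket_name}",
    mount_point=mount_point,
    extra_configs={
        "fs.s3a.access.key": access_key,
        "fs.s3a.secret.key": secret_key,
    },
)

# Files in the bucket are now accessible like any other DBFS path.
display(dbutils.fs.ls(mount_point))
```

Once mounted, Spark reads against `/mnt/my-example-bucket/...` work the same as reads against any DBFS path.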
yesterday
51m ago
What about the IAM config and all that? Is the DBFS file upload permanently disabled for the Community Edition?
9 hours ago
If anyone has found a solution, please post it here.
35m ago
Perfect time for this to happen, right in the middle of a class assignment due in a few days.
The easiest way is to use dbutils.fs.mv to move the file from the local filesystem to DBFS. Here is a complete example:
## create a file in the local filesystem
with open("./test_file.txt", "w") as file:
    lines = "\n".join(map(str, range(5)))
    file.write(lines)

!cat test_file.txt
>> 0
>> 1
>> 2
>> 3
>> 4

# mv the file from the local filesystem to DBFS
dbutils.fs.mv("file:/databricks/driver/test_file.txt", "dbfs:/test_file.txt")
>> Out[10]: True

# Read the file as an RDD
file_path = "dbfs:/test_file.txt"
rdd = sc.textFile(file_path)
rdd.collect()
>> Out[11]: ['0', '1', '2', '3', '4']
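If the content is generated in the notebook itself, a shorter variant (a sketch, again only runnable inside Databricks) is to skip the local file and write the string straight to DBFS with dbutils.fs.put, then read it back with the DataFrame API instead of the RDD API:

```python
# Write the same five lines directly to DBFS.
# The third argument (True) overwrites the file if it already exists.
dbutils.fs.put("dbfs:/test_file.txt", "\n".join(map(str, range(5))), True)

# Read it back as a DataFrame with one string column named "value".
df = spark.read.text("dbfs:/test_file.txt")
df.show()
```

This avoids the intermediate `file:/databricks/driver/` step entirely.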