07-10-2025 09:15 PM
FL_DATAFRAME = spark.read.format("csv")\
.option("header", "false")\
.option("inferschema", "false")\
.option("mode", "FAILFAST")\
.load("file:/databricks/driver/2010_summary.csv")
FL_DATAFRAME.show(5)
ExecutionError: Public DBFS root is disabled. Access is denied on path: /FileStore
Many people out there are hitting this error and can't read files because of it. Can you please suggest what to do in the Databricks Free Edition? This info isn't available anywhere; I have tried GPT, YouTube, everything, and nothing works.
If anyone can guide me step by step through loading and reading a CSV, it would be really helpful.
07-10-2025 09:49 PM
Hey there,
Here's a simple step-by-step way to load a CSV in Databricks Free Edition:
Step 1: Upload the file to your workspace (not DBFS)
This will upload it into the notebook's local directory, not /FileStore.
Step 2: Use the correct path to read it
Once uploaded, the file path will look something like this:
"/Workspace/Users/your-email-or-folder-name/2010_summary.csv"
But Spark doesn't read from /Workspace. So here's a better way:
Use Databricks Utilities to access it:
uploaded_path = "dbfs:/tmp/2010_summary.csv"
dbutils.fs.cp("file:/Workspace/Users/your-folder-name/2010_summary.csv", uploaded_path)
Now read it with Spark:
df = spark.read.format("csv").option("header", "false").load(uploaded_path)
df.show(5)
07-11-2025 12:00 AM
Hi @na_ra_7,
In the Free Edition, DBFS is disabled, so I doubt the approach above will work. You should use Unity Catalog for this purpose anyway; DBFS is a deprecated pattern for interacting with storage.
So, to use a volume, perform the following steps:
Go to Catalogs -> click the workspace catalog -> click the default schema -> click the Create button.
The Create button gives you an option to create a volume. Pick a name and create the volume.
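If you'd rather do this in code than click through the UI, the volume can also be created from a notebook; a minimal sketch, assuming the workspace catalog and default schema from the steps above, with my_volume as a placeholder name:

# Minimal sketch: create the volume with SQL instead of the UI.
# "my_volume" is a placeholder; adjust the catalog/schema/volume names.
spark.sql("CREATE VOLUME IF NOT EXISTS workspace.default.my_volume")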
If you did that, your new volume should appear in Unity Catalog under the default schema, and you will have an option to upload a file to the volume from the UI.
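If you'd rather copy the file into the volume programmatically instead of using the upload dialog, a sketch along these lines should work (both paths are placeholders; adjust them to your actual workspace folder and volume):

# Sketch: copy a workspace file into the Unity Catalog volume.
# Both paths are placeholders for your actual file and volume locations.
dbutils.fs.cp(
    "file:/Workspace/Users/your-folder-name/2010_summary.csv",
    "/Volumes/workspace/default/my_volume/2010_summary.csv"
)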
And here's an example of how to read a CSV from a volume into a DataFrame (a sketch using the placeholder volume name from the steps above):
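# Minimal sketch, assuming the placeholder volume created above.
df = (
    spark.read.format("csv")
    .option("header", "false")
    .option("inferSchema", "false")
    .load("/Volumes/workspace/default/my_volume/2010_summary.csv")
)
df.show(5)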
07-11-2025 12:43 AM
hi @szymon_dybczak,
thank you very much for the Unity Catalog solution.
It worked! I'd been struggling for the last 3 days.
thanks again :)
07-11-2025 12:58 AM
Hi @na_ra_7,
No problem, happy to help 🙂