Databricks Free Trial Help
Engage in discussions about the Databricks Free Trial within the Databricks Community. Share insights, tips, and best practices for getting started, troubleshooting issues, and maximizing the value of your trial experience to explore Databricks' capabilities effectively.

Public DBFS root is disabled. Access is denied on path

ViratKumar1061
New Contributor III

Hi Team,
I am using Databricks Free Edition to run some jobs, but I am getting this error: Public DBFS root is disabled. Access is denied on path: /FileStore/tables/
How can I get access to this location? Could anyone help me here?

16 REPLIES

nikhilj0421
Databricks Employee
Databricks Employee

Hi @ViratKumar1061, I am not very familiar with the features provided in the Databricks Free Edition, but you can check Settings -> Advanced -> DBFS File Browser.

There will be a toggle to enable the DBFS File Browser; check whether it is disabled.

nikhilj0421_0-1749951682777.png

 

 


ViratKumar1061
New Contributor III

Hi Nikhil,

I already checked in Settings and there is no toggle option to enable the DBFS File Browser in Databricks Free Edition. Please help us with this; we are stuck.

vidhey1707
New Contributor III

Hi Virat, after the recent Databricks update, DBFS was deprecated when Community Edition was changed to Free Edition a few days ago. You can use a Volume instead of FileStore. Try this command for more understanding:

%fs ls dbfs:/Volumes/workspace/default/

You can create a folder or external storage:

%fs mkdirs /Volumes/workspace/default/<folder_name>/

and use this path for your operations.
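If you have existing code that points at the old /FileStore layout, a small helper can rewrite those paths into the Volume layout described above. This is a minimal sketch, not a Databricks API: the function name and the default `workspace`/`default` catalog and schema names are assumptions based on the defaults a Free Edition workspace reportedly ships with.

```python
# Hypothetical helper (not part of any Databricks library): rewrite a
# legacy DBFS /FileStore path into the Unity Catalog Volume layout,
# e.g. /Volumes/<catalog>/<schema>/... . Catalog and schema defaults
# are assumptions; adjust them to what your Catalog UI shows.
def filestore_to_volume(path, catalog="workspace", schema="default"):
    # Strip an optional dbfs: scheme prefix first.
    if path.startswith("dbfs:"):
        path = path[len("dbfs:"):]
    prefix = "/FileStore/"
    if not path.startswith(prefix):
        raise ValueError(f"not a legacy FileStore path: {path}")
    relative = path[len(prefix):]
    return f"/Volumes/{catalog}/{schema}/{relative}"

print(filestore_to_volume("dbfs:/FileStore/tables/my_data.csv"))
# -> /Volumes/workspace/default/tables/my_data.csv
```

You would still need to create the target Volume (and copy any files into it) before the rewritten path is readable in a notebook.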

Hi Vidhey, I tried "%fs ls dbfs:/Volumes/" and it shows the error:

"The maximum number of retries has been exceeded"

vidhey1707
New Contributor III

Try this after some time; hopefully it should work.

After some time I tried "%fs ls dbfs:/Volumes/" and got the error below:

ExecutionError: (com.databricks.backend.daemon.data.common.InvalidMountException) Error while using path /Volumes for creating file system within mount at '/Volumes'.

felipevsr
New Contributor II

Hello. You can create a volume in Databricks Free Edition to replace DBFS. I had the same problem, and I managed to create a volume.

felipevsr_0-1752062121299.png

felipevsr_2-1752062217344.png

 

jazz890
New Contributor II

jazz890_0-1752602312835.png

Hi,
I am still getting the same error.

felipevsr
New Contributor II

Hello.

The problem is that the /FileStore/tables/ path doesn't exist in Databricks Free Edition.

Even though the first part of the code indicates a DataFrame was created, it wasn't.

And when you try to save the data to the created Volume, you get an error.

I used the same path you did.

felipevsr_0-1752607083092.png

 

 

felipediassouza
New Contributor III

I have the same problem. I tried to load a file using these variations and received the same error:

csv_file_path = "/FileStore/data-co-uk/data/raw/1920_E0.csv"
OR
csv_file_path = "/Workspace/Repos/felipedi@gmail.com/data-co-uk/data/raw/1920_E0.csv"
OR
csv_file_path = "dbfs:/Workspace/Users/felipedi@gmail.com/data-co-uk/data/raw/1920_E0.csv"

and received:
Public DBFS root is disabled. Access is denied on path: /FileStore/data-co-uk/data/raw/1920_E0.csv
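A rough way to see why all three variations fail is to check whether each path resolves under /Volumes, which is the layout Free Edition still serves, or falls back to the public DBFS root, which is disabled. This heuristic is an assumption based on the errors reported in this thread, not documented Databricks behavior.

```python
# Assumption (inferred from this thread, not official docs): in Free
# Edition, only Unity Catalog Volume paths under /Volumes are readable
# with file APIs; /FileStore and other bare dbfs:/ locations hit the
# disabled public DBFS root.
def hits_public_dbfs_root(path):
    # Normalize away an optional dbfs: scheme prefix.
    p = path[len("dbfs:"):] if path.startswith("dbfs:") else path
    return not p.startswith("/Volumes/")

attempts = [
    "/FileStore/data-co-uk/data/raw/1920_E0.csv",
    "dbfs:/Workspace/Users/felipedi@gmail.com/data-co-uk/data/raw/1920_E0.csv",
    "/Volumes/workspace/default/raw/1920_E0.csv",  # hypothetical Volume path
]
for p in attempts:
    print(p, "-> blocked" if hits_public_dbfs_root(p) else "-> ok")
```

Only the last, Volume-based path would pass this check; the Volume itself must exist first.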

BS_THE_ANALYST
Esteemed Contributor II

Hi @ViratKumar1061, has your problem been resolved? If not, could you provide an update so we can look at next steps toward a solution.

All the best,
BS

felipediassouza
New Contributor III

My problem was resolved. I discovered that I was trying to access via Repos, but I changed to access via Catalog, and it worked well.

amey2220111
New Contributor II

As DBFS is disabled in the new Free Edition, you need to import your CSV file into the catalog; refer to "https://www.youtube.com/watch?v=sOCQgoemQIo". Then you can use that file path to access the file and perform operations or .show()

amey2220111_0-1752734058573.png
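Once a file is imported through the Catalog UI, it lives under a three-level namespace (catalog, schema, volume). A tiny builder like the one below can assemble the path to pass to a reader; this is an illustration only, and the `workspace`/`default`/`raw` names are hypothetical placeholders for whatever your Catalog UI shows.

```python
# Hypothetical illustration: assemble a Unity Catalog volume file path
# from its three-level namespace plus a filename. All names here are
# placeholders; read the actual names from the Catalog UI.
def volume_file_path(catalog, schema, volume, filename):
    parts = (catalog, schema, volume, filename)
    # Guard against empty components or stray slashes that would
    # silently change which object the path points at.
    if any(not p or "/" in p for p in parts):
        raise ValueError("names must be non-empty and contain no slashes")
    return "/Volumes/" + "/".join(parts)

print(volume_file_path("workspace", "default", "raw", "sales.csv"))
# -> /Volumes/workspace/default/raw/sales.csv
```

The resulting string is what you would hand to a notebook reader such as a CSV load, in place of the old /FileStore path.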

 
