Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Unable to use Auto Loader for External Location in Community Edition

AanchalSoni
New Contributor II

Using Community Edition

I'm trying to create a streaming pipeline using Auto Loader (accessing an external location), and each time my select query throws this error: "An error occurred while calling t.analyzeAndFormatResult. : java.lang.UnsupportedOperationException: Public DBFS root is disabled. Access is denied on path: /local_disk0/tmp/autoloader_schemas_DLTAnalysisID-ec51b16e-3b7e-37ba-a57d-eb79d79a4ab84641536424733736813/1280625787"

Is it not possible to use Auto Loader to access external storage in the Community version? Please guide.
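
A minimal sketch of this kind of pipeline source, with a hypothetical external storage URL and table name, and no explicit schema location configured:

```python
import dlt  # available inside a Lakeflow Declarative Pipelines (DLT) notebook


@dlt.table
def raw_orders():
    # Auto Loader reading straight from an external location.
    # With no Volume-backed location configured, Auto Loader's schema tracking
    # lands on a DBFS-backed temp path, which Free Edition blocks (hence the
    # "Public DBFS root is disabled" error above).
    return (
        spark.readStream
        .format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load("abfss://landing@examplestorage.dfs.core.windows.net/orders")
    )
```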


5 REPLIES

szymon_dybczak
Esteemed Contributor III

Hi @AanchalSoni,

To make it clear: are you using Free Edition or Community Edition? People confuse the two all the time, so it's better to clarify this first.

AanchalSoni
New Contributor II

I believe I'm on Free Edition. (I have access to Volumes, Genie, etc.; the DBFS option is not available.) Sharing an image for reference.

szymon_dybczak
Esteemed Contributor III
Accepted Solution

OK, so this is a common problem with Free Edition. Since the DBFS root is disabled on Free Edition, you need to use Volumes with Auto Loader. All the source files you want to load, along with the Auto Loader schema location and checkpoint location, should live on a Volume.

You can follow the guide below, which deals with the exact same issue as yours. Just replay the same steps as the author of the article and it will work 😉

When DBFS Is Not an Option: How Databricks Volumes Saved the Day for File-Based Learning on Free Edi...
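
For example, here is a minimal sketch written as a plain Structured Streaming read, assuming a Unity Catalog Volume at /Volumes/main/default/landing (the catalog, schema, volume, and table names are hypothetical), with the source files, schema location, and checkpoint all on that Volume:

```python
# Minimal sketch: Auto Loader with every path on a Unity Catalog Volume.
# Catalog/schema/volume and table names below are hypothetical placeholders.
base = "/Volumes/main/default/landing"

df = (
    spark.readStream
    .format("cloudFiles")
    .option("cloudFiles.format", "csv")
    .option("cloudFiles.schemaLocation", f"{base}/_schemas/orders")  # schema tracking on the Volume
    .option("header", "true")
    .load(f"{base}/orders")  # source files on the Volume
)

(
    df.writeStream
    .option("checkpointLocation", f"{base}/_checkpoints/orders")  # checkpoint on the Volume
    .trigger(availableNow=True)
    .toTable("main.default.orders_bronze")
)
```

Since the source, schema location, and checkpoint all live under /Volumes/..., nothing touches the disabled DBFS root.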

AanchalSoni
New Contributor II

Thanks for your guidance 🙂

szymon_dybczak
Esteemed Contributor III

No problem @AanchalSoni, I'm glad that I could help 😉
