Error when trying to Create a STREAMING TABLE using Databricks SQL

giuseppe_esq
New Contributor III

Hi all,

I am currently using the Databricks Free Edition and am trying to create a streaming table using the following SQL script:

%sql
CREATE OR REFRESH STREAMING TABLE sql_csv_autoloader
SCHEDULE EVERY 1 WEEK
AS
SELECT *
FROM STREAM read_files(
  '/Volumes/workspace/sql_using_auto_loader/autoloader_source',
  format => 'CSV',
  sep => ',',
  header => true
);
 
However, when I execute the script, I get the following error message:
UnsupportedOperationException: Public DBFS root is disabled. Access is denied on path: /local_disk0/tmp/autoloader_schemas_DLTAnalysisID-5771
 
Does anyone have any workarounds for this please?
 
Thanks
Giuseppe
ACCEPTED SOLUTION

WayneRevenite
New Contributor III

Hi Giuseppe, I've tried to recreate this and am unable to. Was your volume created as managed or external?

My successful steps:

  1. Created a new Volume in workspace.default called "example_volume"
  2. Created a new folder in that volume called "example_folder" and uploaded a CSV to the folder
  3. Ran the following in a query window, using the Serverless Starter Warehouse: 
CREATE OR REFRESH STREAMING TABLE sql_csv_autoloader
SCHEDULE EVERY 1 WEEK
AS
SELECT *
FROM STREAM read_files(
  '/Volumes/workspace/default/example_volume/example_folder/',
  format => 'CSV',
  sep => ',',
  header => true
);

That succeeded, with the sql_csv_autoloader table created under workspace.default ("The operation was successfully executed.") and the data loaded.
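
If you want to confirm the load from the same query window, something like this should work (the names below assume the exact statement above ran as-is; adjust if your catalog or schema differ):

-- Sanity checks on the new streaming table (names assumed from the statement above)
SELECT COUNT(*) FROM workspace.default.sql_csv_autoloader;
DESCRIBE TABLE EXTENDED workspace.default.sql_csv_autoloader;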


REPLIES


giuseppe_esq
New Contributor III

Hi,

Thanks for your response. To confirm, I created my volumes as Managed.

If I need to use external volumes to execute this script, will I need to create an external storage location (for S3 storage) in Amazon? If so, is this free?

Kind regards

Giuseppe
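
For reference, this is roughly what registering an S3 bucket as a Unity Catalog external location looks like; the bucket and credential names here are placeholders, and it presupposes a storage credential with access to the bucket:

-- Illustrative only: location, bucket, and credential names are placeholders
CREATE EXTERNAL LOCATION IF NOT EXISTS my_s3_location
URL 's3://my-example-bucket/data'
WITH (STORAGE CREDENTIAL my_storage_credential);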

giuseppe_esq
New Contributor III

Sorry, did you also manage to run this script using the Databricks Free Edition?

Thanks again

Giuseppe

WayneRevenite
New Contributor III

Yes - I was able to do it no problem on Free, with the exact steps above, using the default supplied compute.

giuseppe_esq
New Contributor III

Hi again,

Thanks for your response.

I was trying to execute the %sql script in a workspace notebook (hence the error), whereas you executed the script in the SQL Editor, and that worked.

Kind regards

Giuseppe
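
As a side note, once the streaming table exists you can also trigger a refresh on demand from the SQL Editor rather than waiting for the weekly schedule — a minimal sketch, using the table name from this thread:

-- Manually refresh the streaming table between scheduled runs
REFRESH STREAMING TABLE sql_csv_autoloader;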
