WayneRevenite
Databricks Partner

Hi Giuseppe, I've tried to recreate this and am unable to. Was your volume created as managed or external?
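
If you're not sure which type your volume is, you can check its metadata with a quick query (volume name below is from my example; substitute yours):

```sql
-- Shows the volume's metadata, including whether it is MANAGED or EXTERNAL
DESCRIBE VOLUME workspace.default.example_volume;
```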

My successful steps:

  1. Created a new Volume in workspace.default called "example_volume"
  2. Created a new folder in that volume called "example_folder" and uploaded a CSV to the folder
  3. Ran the following in a query window, using the Serverless Starter Warehouse: 
CREATE OR REFRESH STREAMING TABLE sql_csv_autoloader
SCHEDULE EVERY 1 WEEK
AS
SELECT *
FROM STREAM read_files(
  '/Volumes/workspace/default/example_volume/example_folder/',
  format => 'CSV',
  sep => ',',
  header => true
);

That succeeded, with the sql_csv_autoloader table created under workspace.default ("The operation was successfully executed.") and the data loaded.
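
A quick way to confirm the data landed (table and catalog names from the steps above):

```sql
-- Inspect a few rows of the newly created streaming table
SELECT * FROM workspace.default.sql_csv_autoloader LIMIT 10;
```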
