2 weeks ago
Hi all,
I am currently using Databricks Free Edition and trying to create a streaming table with the following SQL script:
2 weeks ago
Hi Giuseppe, I've tried to recreate this and am unable to. Was your volume created as managed or external?
My successful steps:
CREATE OR REFRESH STREAMING TABLE sql_csv_autoloader
SCHEDULE EVERY 1 WEEK
AS
SELECT *
FROM STREAM read_files(
'/Volumes/workspace/default/example_volume/example_folder/',
format => 'CSV',
sep => ',',
header => true
);
That succeeded: the sql_csv_autoloader table was created under workspace.default ("The operation was successfully executed.") with the data loaded.
2 weeks ago
Hi,
Thanks for your response. To confirm, I created my volumes as Managed.
If I need to use external volumes to execute this script, will I need to create an external storage location (for S3 storage) in Amazon? If so, is this free?
Kind regards
Giuseppe
2 weeks ago
Sorry, did you also manage to run this script using the Databricks Free Edition?
Thanks again
Giuseppe
2 weeks ago
Yes - I was able to do it no problem on Free, with the exact steps above, using the default supplied compute.
2 weeks ago
Hi again,
Thanks for your response.
I was trying to execute the script with the %sql magic in a workspace notebook (hence the error). Running it in the SQL Editor, as you did, worked.
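For reference, a notebook cell attempt would look like the sketch below (same script as above, with the example volume path from this thread). Note that CREATE OR REFRESH STREAMING TABLE statements are supported on SQL warehouses (i.e. the SQL Editor) and in pipelines, which may be why the notebook run failed on regular compute.

```sql
%sql
-- In a Python-default notebook, the %sql magic on the first line runs the
-- cell as SQL. The volume path is the example path used earlier in the thread.
CREATE OR REFRESH STREAMING TABLE sql_csv_autoloader
SCHEDULE EVERY 1 WEEK
AS
SELECT *
FROM STREAM read_files(
  '/Volumes/workspace/default/example_volume/example_folder/',
  format => 'CSV',
  sep => ',',
  header => true
);
```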
Kind regards
Giuseppe