01-30-2026 01:57 PM
Hello,
I'm new to using Databricks Free Edition and I'm following the Data Ingestion with Lakeflow Connection training track. Since I'm using the free version, I don't have access to the available lab resources.
In the Auto Loader training step, I've already imported a similar CSV file into my directory, but I'm having trouble creating my Delta Live Table.
When I run the command below on my notebook:
CREATE OR REFRESH STREAMING TABLE sql_csv_autoloader
  SCHEDULE EVERY 1 WEEK
AS SELECT *
   FROM STREAM read_files(
     '/Volumes/workspace/default/dbacademy_ecommerce/csv_files_autoloader_source',
     FORMAT => 'csv',
     SEP => ';',
     HEADER => true
   );
I get the following error (see screenshot). I went to the documentation for help with this problem, and I understood that I need to create a pipeline, but it's not clear how to create it or when I should execute this SQL command.
My impression is that the documentation is outdated, or that I'm looking in the wrong place.
https://docs.databricks.com/aws/en/ldp/developer/ldp-sql-ref-create-streaming-table
Can someone help me?
02-01-2026 01:54 AM
Hi @pradeep_singh ,
Based on the screenshot @Rafa3loneil provided, he has already done this. I think the main issue here is that he tried to execute the SDP code as a regular notebook cell (1). Instead, he needs to initialize the pipeline using the "Run Pipeline" button (2).
01-30-2026 11:28 PM
In declarative pipelines you don't run cells in the notebook. Try using the "Run Pipeline" button instead.
01-31-2026 08:49 PM
If you haven't already:
Below the Catalog option on the left-hand side, select Jobs and Pipelines.
Select Create a new pipeline.
Point this pipeline to source from this notebook.
Run the pipeline.
This is how you run SDP code.
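To make the steps concrete, here is a sketch of what the notebook you point the pipeline at could contain. It reuses the statement from the original question (table name, volume path, and CSV options are taken from it); the key point is that this statement is only executed when the pipeline runs, not when you run the cell directly:

```sql
-- Source notebook for the pipeline (sketch, based on the statement in the question).
-- Running this as a regular notebook cell will fail; the pipeline engine
-- picks it up when you press "Run pipeline".
CREATE OR REFRESH STREAMING TABLE sql_csv_autoloader
  SCHEDULE EVERY 1 WEEK
AS SELECT *
   FROM STREAM read_files(
     '/Volumes/workspace/default/dbacademy_ecommerce/csv_files_autoloader_source',
     FORMAT => 'csv',
     SEP => ';',
     HEADER => true
   );
```

Once the pipeline's source points at this notebook, triggering the pipeline creates and refreshes the streaming table.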
02-01-2026 06:04 AM
Got it. Thanks for clarifying. I guess I missed the details 😀