a week ago
Hello,
I'm encountering an issue while converting SQL code to the Lakebridge Snowflake dialect. It seems that DML and DDL statements may not be supported in the Snowflake dialect within Lakebridge.
Could you please confirm whether DML and DDL statements are supported in the Lakebridge Snowflake dialect and, if so, how they can be executed?
a week ago - last edited a week ago
Hi @jadhav_vikas ,
Looking at the Lakebridge source code, particularly its test section, it seems that it should support DDL and DML statements for Snowflake. Did you get an error, or did it just not work?
a week ago
Hi @szymon_dybczak
As per my understanding and testing, Lakebridge currently does not support DDL or DML statements for the Snowflake dialect.
Only read or query operations are supported at this time.
Please let me know if you have any recent updates or documentation indicating otherwise.
I have attached a snapshot that shows the converted code and the corresponding output.
a week ago
Hi @jadhav_vikas,
I'll try to use it after work and I'll let you know if it works.
a week ago
Okay, sure @szymon_dybczak.
Thanks for understanding
yesterday
@jadhav_vikas , I did some digging through internal docs and I have some hints/suggestions.
Databricks Lakehouse Federation (often referred to as "Lakehouse Bridge") provides read-only access to Snowflake; DML and DDL are not supported when querying Snowflake through a foreign catalog. In internal guidance this is summarized as "can read from Snowflake but not write to Snowflake."
Lakebridge (the Databricks Labs SQL transpiler/analyzer) does support Snowflake dialect constructs, including many DML/DDL statements, but for migration and translation to Databricks SQL; execution happens on Databricks, not Snowflake. It is used to parse Snowflake SQL, including DDL/DML, and transpile it to DBSQL, and its analyzer also extracts DDL/DML from Snowflake query history for migration sizing.
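For reference, here is a minimal sketch of the read-only Federation path. All connection, catalog, schema, table, and credential names below are placeholders, not values from this thread:
-- Create a Unity Catalog connection to Snowflake (credentials are placeholders)
CREATE CONNECTION snowflake_conn TYPE snowflake
OPTIONS (
  host 'myaccount.snowflakecomputing.com',
  port '443',
  sfWarehouse 'MY_WH',
  user 'my_user',
  password 'my_password'
);

-- Expose a Snowflake database as a foreign catalog
CREATE FOREIGN CATALOG snowflake_catalog
USING CONNECTION snowflake_conn
OPTIONS (database 'MY_DB');

-- Reads work through the foreign catalog
SELECT * FROM snowflake_catalog.my_schema.my_table LIMIT 10;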
Three different things are in play here:
- Lakehouse Federation to Snowflake (a Unity Catalog "connection" plus a "foreign catalog")
- Lakebridge (Databricks Labs) Snowflake dialect (see the sketch after this list)
- Databricks SQL (DBSQL) execution on Databricks
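To make the Lakebridge piece concrete, here is an illustrative before/after. This is not literal transpiler output, just the kind of rewrite involved; exact results depend on the Lakebridge version:
-- Snowflake input (TOP and IFF are Snowflake-specific):
--   SELECT TOP 5 id, IFF(val IS NULL, 'n/a', val) FROM tgt;
-- An equivalent in Databricks SQL:
SELECT id, IF(val IS NULL, 'n/a', val) AS val
FROM tgt
LIMIT 5;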
Goal: Keep data and execution on Databricks (migrating Snowflake SQL to Databricks)
-- Create the target table on Databricks
CREATE TABLE tgt (id INT, val STRING);
-- Merge from a source table (src is assumed to already exist on Databricks)
MERGE INTO tgt t
USING src s
ON t.id = s.id
WHEN MATCHED THEN UPDATE SET t.val = s.val
WHEN NOT MATCHED THEN INSERT (id, val) VALUES (s.id, s.val);
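Once transpiled, the statements above are plain Databricks SQL, so they run on a DBSQL warehouse or cluster with no Snowflake dependency.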
Goal: Execute DML/DDL directly against Snowflake. This is where the limitation applies: the Federation connector to Snowflake is read-only, so writes cannot be pushed through a foreign catalog; the DML/DDL has to run on Snowflake itself.
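As a quick illustration, using the placeholder names from the sketch above:
-- Reads through the foreign catalog are fine
SELECT COUNT(*) FROM snowflake_catalog.my_schema.my_table;

-- Writes are rejected, because Federation to Snowflake is read-only:
INSERT INTO snowflake_catalog.my_schema.my_table VALUES (1, 'x');  -- fails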
Cheers, Louis.