
#databricks snowflake dialect

jadhav_vikas
New Contributor II

Hello,
I'm encountering an issue while converting SQL code to the Lakebridge Snowflake dialect. It seems that DML and DDL statements may not be supported in the Snowflake dialect within Lakebridge.
Could you please confirm whether DML and DDL statements are supported in the Lakebridge Snowflake dialect, and if so, how they can be executed?

5 REPLIES

szymon_dybczak
Esteemed Contributor III

 

Hi @jadhav_vikas,

Looking at the Lakebridge source code, particularly its test section, it seems that it should support DDL and DML statements for Snowflake. Did you get an error, or did it just not work?

[Screenshot: szymon_dybczak_0-1762327133916.png]

 

Hi @szymon_dybczak

As per my understanding and testing, Lakebridge currently does not support DDL or DML statements for the Snowflake dialect; only read/query operations are supported at this time.
Please let me know if you have any recent updates or documentation indicating otherwise.

I have attached a snapshot that shows the converted code and the corresponding output.
[Attachment: Screenshot 2025-11-05 125702.png]

szymon_dybczak
Esteemed Contributor III

Hi @jadhav_vikas,

I'll try to use it after work and I'll let you know if it works. 

Okay, sure @szymon_dybczak.
Thanks for understanding.

Louis_Frolio
Databricks Employee

@jadhav_vikas , I did some digging through internal docs and I have some hints/suggestions.

Short answer

  • Databricks Lakehouse Federation (often referred to as "Lakehouse Bridge") provides read-only access to Snowflake; DML and DDL are not supported when querying Snowflake through a foreign catalog.

    In internal guidance, this is summarized as "can read from Snowflake but not write to Snowflake."
  • Lakebridge (Databricks Labs SQL transpiler/analyzer) does support Snowflake dialect constructs (including many DML/DDL) for migration and translation to Databricks SQL, but execution happens on Databricks, not Snowflake. It's used to parse Snowflake SQL, including DDL/DML, and transpile to DBSQL; the analyzer also extracts DDL/DML from Snowflake query history for migration sizing.
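To make the transpiler point concrete, here is a minimal sketch of the kind of translation involved. The table names are hypothetical and the output is illustrative only; the exact SQL Lakebridge emits depends on the version you run:

  -- Snowflake-dialect input handed to the transpiler (hypothetical tables):
  CREATE OR REPLACE TABLE top_orders AS
  SELECT TOP 5 * FROM orders ORDER BY amount DESC;

  -- Illustrative Databricks SQL output: the CTAS carries over,
  -- while Snowflake's TOP clause becomes a LIMIT clause.
  CREATE OR REPLACE TABLE top_orders AS
  SELECT * FROM orders ORDER BY amount DESC LIMIT 5;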

     

What's supported where

  • Lakehouse Federation to Snowflake (Unity Catalog "connection" + "foreign catalog")

    • Read-only queries with pushdowns; no DML/DDL pushdown to Snowflake from Databricks via federation.
    • Setup and supported pushdowns are documented in the Snowflake federation pages; a minimal setup sketch follows after this list.
  • Lakebridge (Databricks Labs) Snowflake dialect

    • Provides Snowflake-to-Databricks SQL translation and migration analysis, including support/tests for constructs like MERGE and CTAS in the Snowflake dialect, then maps them to DBSQL equivalents for execution on Databricks.
  • Databricks SQL (DBSQL) execution on Databricks

    • DBSQL supports DDL (CREATE TABLE/VIEW, ALTER, DROP, etc.) and DML (INSERT, DELETE, UPDATE, MERGE, COPY INTO, etc.) natively for Delta Lake tables.
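For the federation path, here is a minimal setup sketch, assuming a Unity Catalog-enabled workspace; all names, hosts, and credentials are placeholders, and the exact options are covered in the Snowflake federation pages mentioned above:

  -- Create a Unity Catalog connection to Snowflake (placeholder values):
  CREATE CONNECTION snowflake_conn TYPE snowflake
  OPTIONS (
    host 'myaccount.snowflakecomputing.com',
    port '443',
    sfWarehouse 'COMPUTE_WH',
    user 'my_user',
    password 'my_password'
  );

  -- Mirror a Snowflake database as a read-only foreign catalog:
  CREATE FOREIGN CATALOG snowflake_cat
  USING CONNECTION snowflake_conn
  OPTIONS (database 'MY_SNOWFLAKE_DB');

  -- Reads are pushed down to Snowflake; writes are not supported here:
  SELECT * FROM snowflake_cat.my_schema.orders LIMIT 10;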

How to execute, depending on your goal

  • Goal: Keep data and execution on Databricks (migrating Snowflake SQL to Databricks)

    • Use Lakebridge to transpile your Snowflake SQL (including DML/DDL like MERGE, CTAS) into Databricks SQL, then run on a Databricks SQL warehouse or cluster.
    • DBSQL will execute the resulting DML/DDL on Delta tables. Example of DBSQL DML/DDL that is natively supported:
      -- Create source and target tables
      CREATE TABLE src (id INT, val STRING);
      CREATE TABLE tgt (id INT, val STRING);
      
      -- Upsert source rows into the target
      MERGE INTO tgt t
      USING src s
      ON t.id = s.id
      WHEN MATCHED THEN UPDATE SET t.val = s.val
      WHEN NOT MATCHED THEN INSERT (id, val) VALUES (s.id, s.val);
  • Goal: Execute DML/DDL directly against Snowflake

    • Run them in Snowflake itself (e.g., Snowflake UI or SnowSQL CLI), which supports issuing queries, DML, and DDL natively.
       
    • From Databricks, if you need to programmatically write to Snowflake (not via federation), use the Snowflake Spark connector for read/write operations; this is separate from Lakehouse Federation and supports writes by using Snowflake's APIs under the hood.
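As a concrete illustration of the Snowflake-native path above, statements like these run unchanged in a Snowflake worksheet or a SnowSQL session (all names are placeholders):

  -- Executed directly in Snowflake, not through Databricks:
  CREATE TABLE analytics.public.tgt (id INT, val STRING);
  INSERT INTO analytics.public.tgt VALUES (1, 'a');
  UPDATE analytics.public.tgt SET val = 'b' WHERE id = 1;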
       

Common gotchas

  • Issuing CREATE/INSERT/UPDATE/DELETE through a Snowflake foreign catalog in Unity Catalog will fail; the federation path is read-only by design.
  • Lakebridge is for translation and migration; it doesn't push Snowflake DML/DDL to Snowflake. Use it to convert Snowflake SQL to DBSQL and then run on Databricks.
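To illustrate the first gotcha, reusing the hypothetical foreign catalog from the federation sketch above:

  SELECT * FROM snowflake_cat.my_schema.orders;            -- works: reads are federated
  DELETE FROM snowflake_cat.my_schema.orders WHERE id = 1; -- fails: the foreign catalog is read-only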

Recommended path

  • If you are converting SQL "to Lakebridge's Snowflake dialect" to then run via Lakehouse Federation: switch strategy. Use federation only for reads, and use either Snowflake-native execution for DML/DDL or migrate/transpile to DBSQL for execution on Databricks.
     

Cheers, Louis.
