Error with federation for Snowflake in Databricks SQL
12-02-2022 08:48 AM
Hello, I am trying to test the new federation for Snowflake feature in Databricks SQL, following the docs (https://docs.databricks.com/query-federation/snowflake.html).
Environment:
- Subscription: Premium
- Compute: DB SQL (serverless)
- Metastore: Unity Catalog
I am trying to run the following query (but with the parameters for my own Snowflake environment, of course):
CREATE TABLE snowflake_table
USING snowflake
OPTIONS (
dbtable '<table-name>',
sfUrl '<database-host-url>',
sfUser '<my_username>',
sfPassword '<my_password>',
sfDatabase '<database-name>',
sfSchema '<schema-name>',
sfWarehouse '<warehouse-name>'
);
However, I get the following error:
Unsupported operation detected in the query plan. Only Delta data source is supported for table creation or data writing.
Query plan:
CreateDataSourceTableCommand `main`.`default`.`snowflake_table`, false
Why is this happening, and how can I get it working? What limitations of this new feature am I missing and need to be aware of? Thanks in advance for your guidance.
Labels: SQL, Unity Catalog
12-02-2022 01:49 PM
Hi @Bob TheBuilder,
From what I can tell, this looks like a bug that will be fixed in an upcoming release. As a workaround (not ideal, I know), are you able to run that SQL code from a notebook and see if it works?
12-02-2022 01:54 PM
Hi @Landan George, thanks for the quick response. I get the following error if I try to run the SQL code from a Notebook:
SQLException: No suitable driver found for jdbc:snowflake://app.snowflake.com/region/foo/
Any suggestions?
12-02-2022 02:43 PM
@Bob TheBuilder
Does the below work in a notebook?
spark.sql("""
CREATE TABLE snowflake_table
USING snowflake
OPTIONS (
  dbtable '<table-name>',
  sfUrl '<database-host-url>',
  sfUser '<my_username>',
  sfPassword '<my_password>',
  sfDatabase '<database-name>',
  sfSchema '<schema-name>',
  sfWarehouse '<warehouse-name>'
)""")
If not you may need to use this method https://docs.databricks.com/external-data/snowflake.html to temporarily read data while the DB SQL bug is fixed.
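In case it helps, here is a minimal sketch of what that fallback read looks like with the `snowflake` Spark data source described on that page. The option names follow the linked docs; every value is a placeholder you would swap for your own environment, and the small helper function is just my own way of keeping the options together:

```python
# Hedged sketch: reading a Snowflake table via Spark's built-in
# "snowflake" data source (per docs.databricks.com/external-data/snowflake.html),
# as a temporary alternative to CREATE TABLE ... USING snowflake.
# All values below are placeholders, not working credentials.

def snowflake_options(url, user, password, database, schema, warehouse):
    """Assemble the connection options the `snowflake` format expects."""
    return {
        "sfUrl": url,
        "sfUser": user,
        "sfPassword": password,
        "sfDatabase": database,
        "sfSchema": schema,
        "sfWarehouse": warehouse,
    }

# On a Databricks cluster, where `spark` is predefined, the read would be:
# df = (spark.read
#       .format("snowflake")
#       .options(**snowflake_options("<database-host-url>", "<my_username>",
#                                    "<my_password>", "<database-name>",
#                                    "<schema-name>", "<warehouse-name>"))
#       .option("dbtable", "<table-name>")
#       .load())
# df.display()
```

Note this only reads the data into a DataFrame; it doesn't register a Unity Catalog table the way the federation feature is meant to, so treat it as a stopgap until the DB SQL bug is fixed.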
12-03-2022 09:12 AM
@Landan George thanks for sharing the code snippet, but unfortunately no luck: I am still getting the same "No suitable driver found for [...]" error.