Error with federation for Snowflake in Databricks SQL

bob1
New Contributor III

Hello, I am trying to test the new query federation for Snowflake feature in Databricks SQL, following the docs (https://docs.databricks.com/query-federation/snowflake.html).

Environment:

  • Subscription: Premium
  • Compute: DB SQL (serverless)
  • Metastore: Unity Catalog

I am trying to run the following query (but with the parameters for my own Snowflake environment, of course):

CREATE TABLE snowflake_table
USING snowflake
OPTIONS (
  dbtable '<table-name>',
  sfUrl '<database-host-url>',
  sfUser '<my_username>',
  sfPassword '<my_password>',
  sfDatabase '<database-name>',
  sfSchema '<schema-name>',
  sfWarehouse '<warehouse-name>'
);

However, I get the following error:

Unsupported operation detected in the query plan. Only Delta data source is supported for table creation or data writing.
Query plan:
CreateDataSourceTableCommand `main`.`default`.`snowflake_table`, false

Why is this happening, and how can I get this working? Are there limitations of this new feature that I'm missing and need to be aware of? Thanks in advance for your guidance.

4 REPLIES

LandanG
Honored Contributor

Hi @Bob TheBuilder,

From what I can tell, this looks like a bug that will be fixed in an upcoming release. While not ideal, are you able to run that SQL code from a notebook and see if it works?

bob1
New Contributor III

Hi @Landan George, thanks for the quick response. I get the following error when I try to run the SQL code from a notebook:

SQLException: No suitable driver found for jdbc:snowflake://app.snowflake.com/region/foo/

Any suggestions?

LandanG
Honored Contributor

@Bob TheBuilder

Does the below work in a notebook?

spark.sql(f"""
CREATE TABLE snowflake_table
USING snowflake
OPTIONS (
  dbtable '<table-name>',
  sfUrl '<database-host-url>',
  sfUser '<my_username>',
  sfPassword '<my_password>',
  sfDatabase '<database-name>',
  sfSchema '<schema-name>',
  sfWarehouse '<warehouse-name>'
)""")

If not, you may need to use the method described at https://docs.databricks.com/external-data/snowflake.html to read the data temporarily until the DB SQL bug is fixed; a rough sketch of that approach is below.
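Something like the following Python cell is roughly what that page describes, assuming the Snowflake Spark connector that ships with the Databricks Runtime. The <placeholder> values mirror the options in your SQL and are stand-ins for your own Snowflake details; this is only an illustration of the connector read, not a guaranteed fix for the driver error.

# Rough sketch of the Snowflake connector read from the doc linked above.
# Replace the <placeholders> with your own Snowflake connection details.
options = {
    "sfUrl": "<database-host-url>",
    "sfUser": "<my_username>",
    "sfPassword": "<my_password>",
    "sfDatabase": "<database-name>",
    "sfSchema": "<schema-name>",
    "sfWarehouse": "<warehouse-name>",
}

df = (
    spark.read.format("snowflake")  # uses the bundled Spark-Snowflake connector
    .options(**options)
    .option("dbtable", "<table-name>")
    .load()
)
display(df)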

bob1
New Contributor III

@Landan George thanks for sharing the code snippet, but unfortunately no luck. I'm getting the same "No suitable driver found for [...]" error.
