Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Connect Snowflake to Databricks

adhi_databricks
New Contributor III

Hey Folks,

I just want to know if there is a way to mirror Snowflake tables in Databricks, meaning creating a table using the snowflake format and passing the table's connection options (host, user, password, and dbtable). I tried it with the code below:

 

/* The following example applies to Databricks Runtime 11.3 LTS and above. */

DROP TABLE IF EXISTS snowflake_table;
CREATE TABLE snowflake_table
USING snowflake
OPTIONS (
    host '<hostname>',
    port '<port>', /* Optional - will use default port 443 if not specified. */
    user '<username>',
    password '<password>',
    sfWarehouse '<warehouse_name>',
    database '<database-name>',
    schema '<schema-name>', /* Optional - will use default schema "public" if not specified. */
    dbtable '<table-name>'
);
SELECT * FROM snowflake_table;

 

This raises the following error:

[UC_FILE_SCHEME_FOR_TABLE_CREATION_NOT_SUPPORTED] Creating table in Unity Catalog with file scheme jdbc is not supported. Instead, please create a federated data source connection using the CREATE CONNECTION command for the same table provider, then create a catalog based on the connection with a CREATE FOREIGN CATALOG command to reference the tables therein. SQLSTATE: 0AKUC

Following that, I checked the Databricks docs at https://docs.databricks.com/aws/en/query-federation/snowflake?language=SQL#create-a-connection on creating a foreign catalog, which mirrors a Snowflake database. My use case, though, is at the table level: I'd need a single catalog in Databricks containing all the different tables from different schemas. Can you help me understand whether this is possible?
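For context, the federation setup from that docs page looks roughly like this (all names and credentials below are placeholders, not real values):

```sql
-- Create a connection object that stores the Snowflake credentials.
CREATE CONNECTION snowflake_conn TYPE snowflake
OPTIONS (
  host '<hostname>',
  port '443',
  user '<username>',
  password '<password>',
  sfWarehouse '<warehouse_name>'
);

-- Create a foreign catalog that mirrors one Snowflake database.
CREATE FOREIGN CATALOG snowflake_catalog
USING CONNECTION snowflake_conn
OPTIONS (database '<database-name>');
```

Note that the foreign catalog is scoped to one Snowflake database per the OPTIONS clause.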

Also the different tables are from different snowflake accounts!

TIA!

 

4 REPLIES

Alberto_Umana
Databricks Employee

Hi @adhi_databricks,

If you want multiple tables from different schemas/databases in the same catalog, you would need to create a foreign catalog that references each database separately and handle access and schema management appropriately in Databricks.

adhi_databricks
New Contributor III

Hey @Alberto_Umana 

In Databricks, the structure is: Catalog -> Schema -> Tables,

while in Snowflake it is: Database -> Schema -> Tables.

The foreign_catalog in Databricks mirrors the Snowflake database, correct? For example, if one Snowflake account contains 10 databases, I would need to create 10 foreign catalogs in Databricks to reflect those Snowflake tables, as outlined in the documentation.

I would like to know how I can create these databases under a single foreign catalog. Additionally, if this is possible, how do I query the Snowflake tables from within Databricks?

Alberto_Umana
Databricks Employee

Hi @adhi_databricks,

You are right about foreign catalogs. Note that it is not possible to directly create multiple Snowflake databases under a single foreign catalog in Databricks. Each Snowflake database needs a separate foreign catalog to mirror its structure.

  1. Create a foreign catalog for each Snowflake database.
  2. Use the appropriate connector settings to establish the connection and query the tables.
-- Example to query Snowflake table from Databricks
SELECT * FROM foreign_catalog_name.schema_name.table_name;
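Because every foreign catalog is addressable through the usual three-level namespace, you can also join across catalogs in one query, even when they mirror different Snowflake databases or accounts. A sketch, where the catalog, schema, and table names are purely illustrative:

```sql
-- Join tables that live in two different foreign catalogs,
-- e.g. mirroring two different Snowflake databases/accounts.
SELECT o.order_id, c.customer_name
FROM sf_sales_catalog.public.orders AS o
JOIN sf_crm_catalog.public.customers AS c
  ON o.customer_id = c.customer_id;
```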
 

adhi_databricks
New Contributor III

Hi @Alberto_Umana, just a quick question: would we be able to change table properties, such as adding column details, column tagging, and column-level masking, on the Snowflake tables under the foreign catalog we created?
