Hey Folks,
I'd like to know whether there is a way to mirror Snowflake tables in Databricks, i.e. create a table USING the snowflake format and pass the connection details (host, user, password, and dbtable) as table options. I tried it with the code below:
/* The following example applies to Databricks Runtime 11.3 LTS and above. */
DROP TABLE IF EXISTS snowflake_table;
CREATE TABLE snowflake_table
USING snowflake
OPTIONS (
  host '<hostname>',
  port '<port>', /* Optional - defaults to 443 if not specified. */
  user '<username>',
  password '<password>',
  sfWarehouse '<warehouse_name>',
  database '<database-name>',
  schema '<schema-name>', /* Optional - defaults to "public" if not specified. */
  dbtable '<table-name>'
);
SELECT * FROM snowflake_table;
This raises the following error:

[UC_FILE_SCHEME_FOR_TABLE_CREATION_NOT_SUPPORTED] Creating table in Unity Catalog with file scheme jdbc is not supported. Instead, please create a federated data source connection using the CREATE CONNECTION command for the same table provider, then create a catalog based on the connection with a CREATE FOREIGN CATALOG command to reference the tables therein. SQLSTATE: 0AKUC
Based on that, I checked the Databricks docs on creating a foreign catalog (https://docs.databricks.com/aws/en/query-federation/snowflake?language=SQL#create-a-connection), which mirrors an entire Snowflake database. My use case, though, is at the table level: I need a single catalog in Databricks that holds selected tables from different schemas. Can someone help me understand whether this is possible?
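For reference, this is my understanding of the two-step flow the error message and the linked docs describe (snowflake_conn and snowflake_catalog are placeholder names I made up):

```sql
-- Step 1: a connection holds the credentials for one Snowflake account.
CREATE CONNECTION snowflake_conn TYPE snowflake
OPTIONS (
  host '<hostname>',
  port '443',
  user '<username>',
  password '<password>',
  sfWarehouse '<warehouse_name>'
);

-- Step 2: a foreign catalog mirrors one whole Snowflake database
-- (all schemas and tables under it), not a single table.
CREATE FOREIGN CATALOG snowflake_catalog
USING CONNECTION snowflake_conn
OPTIONS (database '<database-name>');

-- Tables are then addressed with three-level names:
SELECT * FROM snowflake_catalog.<schema-name>.<table-name>;
```

If I read the docs correctly, the catalog always maps to a full database, which is why I'm asking whether a table-level mirror is possible.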
Also, the tables come from different Snowflake accounts!
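What I'm imagining (not sure this is supported, so please correct me) is one connection and foreign catalog per Snowflake account, plus views in a single regular catalog that point at the foreign tables, so everything surfaces in one place. All object names below are hypothetical:

```sql
-- Each Snowflake account needs its own connection, since the
-- credentials and host live on the connection.
CREATE CONNECTION sf_conn_b TYPE snowflake
OPTIONS (
  host '<account_b_hostname>',
  user '<username>',
  password '<password>',
  sfWarehouse '<warehouse_name>'
);

CREATE FOREIGN CATALOG sf_catalog_b
USING CONNECTION sf_conn_b
OPTIONS (database '<database-name>');

-- Possible workaround for a single "mirror" catalog: plain views in a
-- regular Unity Catalog schema that select from the foreign tables
-- (assuming views over foreign tables are allowed).
CREATE CATALOG IF NOT EXISTS mirror;
CREATE SCHEMA IF NOT EXISTS mirror.snowflake_tables;
CREATE VIEW mirror.snowflake_tables.orders AS
  SELECT * FROM sf_catalog_b.sales.orders;
```

Would something like this work, or is there a cleaner way to get table-level mirroring across accounts?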
TIA!