Setting up a snowflake catalog via spark config next to unity catalog
12-27-2023 05:10 AM
I'm trying to set up a connection to Iceberg on S3 via Snowflake, as described in https://medium.com/snowflake/how-to-integrate-databricks-with-snowflake-managed-iceberg-tables-7a889... and https://docs.snowflake.com/en/user-guide/tables-iceberg-catalog
On a cluster with Unity Catalog disabled this works fine, and I can read from the tables without connecting to a Snowflake warehouse. When Unity Catalog is enabled, however, the `snowflake_catalog` is nowhere to be found.
Is there a way to set up a catalog via Spark config as described above while using Unity Catalog?
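For context, the Spark config I'm using follows the Snowflake guide linked above; roughly this (the account identifier is a placeholder, and credentials are passed separately):

```
spark.sql.catalog.snowflake_catalog org.apache.iceberg.spark.SparkCatalog
spark.sql.catalog.snowflake_catalog.catalog-impl org.apache.iceberg.snowflake.SnowflakeCatalog
spark.sql.catalog.snowflake_catalog.uri jdbc:snowflake://<account_identifier>.snowflakecomputing.com
```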
Thanks in advance!
03-20-2024 06:15 AM
Hi Kaniz,
Thanks for your elaborate reply! The Iceberg data that we have to interact with uses Snowflake as the catalog, not Glue. Can we also use a catalog integration with Snowflake?
Thanks!
04-04-2024 12:13 AM
Hi @Retired_mod ,
We've been working on setting up Glue as the catalog, which is working fine so far. However, Glue takes the place of the hive_metastore, which appears to be a legacy way of setting this up. Is the approach proposed here the recommended way to set it up, or is there a more "Unity Catalog" way of doing it? https://docs.databricks.com/en/archive/external-metastores/aws-glue-metastore.html
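For reference, the Glue setup we followed is essentially the single cluster Spark config flag from that archived doc (assuming the cluster also has an instance profile with the necessary Glue permissions attached):

```
spark.databricks.hive.metastore.glueCatalog.enabled true
```

With this set, Glue-backed databases show up under `hive_metastore` rather than as a Unity Catalog catalog, which is what prompted the question.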
Thanks in advance,
Laurens

