
Unable to read from or write to Snowflake Open Catalog via Databricks

Sunil_Patidar
New Contributor

I have Snowflake Iceberg tables whose metadata is stored in Snowflake Open Catalog. I am trying to read these tables from the Open Catalog and write back to the Open Catalog using Databricks.

I have explored the available documentation but haven't been able to find clear guidance on how to achieve this. One Snowflake document mentions that Apache Spark can read from and write to Snowflake Open Catalog:
https://docs.snowflake.com/en/user-guide/opencatalog/register-service-connection#examples
Based on this, I'm assuming that if Apache Spark supports read/write operations with Snowflake Open Catalog, then Databricks (which is Spark-based) should be able to do the same.

My main goal is to achieve interoperability between Snowflake and Databricks using Snowflake Open Catalog, which is compatible with the Iceberg REST Catalog.
Could someone please clarify:
• Whether Databricks officially supports reading from and writing to Snowflake Open Catalog
• If so, whether there are any reference architectures, examples, or configuration steps available
Any guidance or pointers to the correct documentation would be greatly appreciated.

Thanks in advance!

1 REPLY

Louis_Frolio
Databricks Employee

Greetings @Sunil_Patidar, Databricks and Snowflake can interoperate cleanly around Iceberg today, but how you do it matters.

At a high level, interoperability works because both platforms meet at Apache Iceberg and the Iceberg REST Catalog API.

What works today

• Snowflake can read (GA) and write (Public Preview) Unity Catalog-managed Iceberg tables using Databricks' Iceberg REST Catalog implementation in Unity Catalog.

• Databricks can discover and read Iceberg tables managed by Snowflake Horizon Catalog using Unity Catalog Iceberg catalog federation (Public Preview, requires Databricks Runtime 16.4 LTS+).

• Databricks can also connect directly to Snowflake Open Catalog using the standard Apache Iceberg REST client (Spark catalog). This supports full read/write and is the most flexible option today.

Important nuance

Write-back to Snowflake Horizon via Unity Catalog federation is not explicitly documented as supported in the current Public Preview. If you need true bidirectional writes between Databricks and Snowflake-managed Iceberg tables, use the direct Spark + Iceberg REST approach rather than UC federation.

 

Recommended patterns

If Snowflake needs access to Databricks-managed tables

→ Use Unity Catalog's Iceberg REST APIs with Snowflake's REST catalog integration.
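On the Snowflake side this is set up as a catalog integration pointing at your workspace's Unity Catalog Iceberg REST endpoint (see the Snowflake and Databricks docs for the exact steps). Before wiring Snowflake up, it can help to confirm that the UC endpoint answers a standard Iceberg REST client. Here is a minimal sketch using PyIceberg; the endpoint path, token type, and catalog/schema names are placeholders you would replace with values from your own workspace, not exact values:

# Minimal sketch: confirm Unity Catalog's Iceberg REST endpoint answers a
# standard Iceberg REST client (PyIceberg). The endpoint path, token, and
# catalog/schema names below are placeholders; check the Databricks docs for
# the exact Iceberg REST endpoint of your workspace.
from pyiceberg.catalog import load_catalog

uc = load_catalog(
    "uc",
    **{
        "type": "rest",
        "uri": "https://<workspace-url>/<unity-catalog-iceberg-rest-path>",
        # Assumption: a Databricks PAT or OAuth token with access to the catalog.
        "token": "<databricks-token>",
        # The Unity Catalog catalog name is passed as the Iceberg warehouse.
        "warehouse": "<uc_catalog_name>",
    },
)

# List UC schemas (Iceberg namespaces) and tables to verify connectivity.
print(uc.list_namespaces())
print(uc.list_tables("<uc_schema_name>"))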

If Databricks needs read access to Snowflake Horizon tables

→ Use Unity Catalog Iceberg catalog federation (DBR 16.4 LTS+).

If you need full read/write interoperability

→ Configure Databricks as an Apache Iceberg Spark client pointing directly at Snowflake Open Catalog via the Iceberg REST API.
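Here is a minimal sketch of that direct setup, loosely following the Spark example in the Snowflake doc you linked. Everything in angle brackets is a placeholder from your Open Catalog account, the catalog name "opencatalog" is arbitrary, and you also need a matching Apache Iceberg Spark runtime library (an org.apache.iceberg:iceberg-spark-runtime artifact that matches your Runtime's Spark and Scala versions) attached to the cluster. The same properties can instead be set once in the cluster's Spark config:

# Minimal sketch: register Snowflake Open Catalog as an Iceberg REST catalog
# named "opencatalog". All angle-bracket values are placeholders from your
# Open Catalog account. "spark" is the SparkSession provided by the
# Databricks notebook.
open_catalog_conf = {
    "spark.sql.catalog.opencatalog": "org.apache.iceberg.spark.SparkCatalog",
    "spark.sql.catalog.opencatalog.type": "rest",
    # Open Catalog REST endpoint for your account (placeholder URL).
    "spark.sql.catalog.opencatalog.uri": "https://<open-catalog-account>.snowflakecomputing.com/polaris/api/catalog",
    # OAuth client credentials of the Open Catalog service connection.
    "spark.sql.catalog.opencatalog.credential": "<client_id>:<client_secret>",
    # The Open Catalog catalog name is passed as the Iceberg warehouse.
    "spark.sql.catalog.opencatalog.warehouse": "<catalog_name>",
    # Principal role granted to the service connection.
    "spark.sql.catalog.opencatalog.scope": "PRINCIPAL_ROLE:<principal_role_name>",
    # Ask the catalog to vend short-lived storage credentials to Spark.
    "spark.sql.catalog.opencatalog.header.X-Iceberg-Access-Delegation": "vended-credentials",
}

for key, value in open_catalog_conf.items():
    spark.conf.set(key, value)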

Key things to watch

• Use fully qualified catalog names in UC-enabled workspaces (don't replace spark_catalog); a short usage sketch follows this list.

• Match Iceberg runtime library versions to your Databricks Runtime.

• Snowflake Open Catalog uses catalog-vended, short-lived credentials via the Iceberg REST API; no long-lived cloud credentials are required.
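To illustrate the first and last points, a short usage sketch that assumes the "opencatalog" catalog configured above; the schema and table names are made-up placeholders, and spark_catalog stays untouched because every reference is fully qualified:

# Usage sketch with the "opencatalog" catalog configured earlier; the
# "analytics.orders" schema/table names are placeholders, not real objects.
df = spark.table("opencatalog.analytics.orders")
df.show(5)

# Write back through the same REST catalog (DataFrameWriterV2, Spark 3+).
df.writeTo("opencatalog.analytics.orders_copy").createOrReplace()

# SQL works too, as long as the catalog name is fully qualified.
spark.sql("SELECT COUNT(*) FROM opencatalog.analytics.orders").show()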

Bottom line

Iceberg interoperability between Databricks and Snowflake is real and usable today, but federation is currently best suited for discovery and read access. For bidirectional data movement, go direct via Iceberg REST.

Hope that helps, Louis.