I have Snowflake Iceberg tables whose metadata is stored in Snowflake Open Catalog. I am trying to read these tables from the Open Catalog and write back to the Open Catalog using Databricks.
I have explored the available documentation but haven't been able to find clear guidance on how to achieve this. One Snowflake document mentions that Apache Spark can read from and write to Snowflake Open Catalog:
https://docs.snowflake.com/en/user-guide/opencatalog/register-service-connection#examples
Based on this, I'm assuming that since Apache Spark supports read/write operations with Snowflake Open Catalog, Databricks (which is built on Spark) should be able to do the same.
My main goal is to achieve interoperability between Snowflake and Databricks using Snowflake Open Catalog, which is compatible with the Iceberg REST Catalog.
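For context, this is the kind of Spark session configuration I understand the linked Snowflake example to imply for an Iceberg REST catalog. The catalog name `opencatalog`, the Iceberg runtime version, and all `<...>` placeholders are my own illustrative assumptions, not values from any official Databricks guidance:

```python
# Sketch of Spark properties for registering Snowflake Open Catalog as an
# Iceberg REST catalog, following the pattern in the linked Snowflake doc.
# The catalog name "opencatalog" and every <...> placeholder are illustrative.
open_catalog_conf = {
    # Iceberg runtime and SQL extensions (version must match your Spark/Scala build)
    "spark.jars.packages": "org.apache.iceberg:iceberg-spark-runtime-3.5_2.12:1.5.2",
    "spark.sql.extensions": "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions",
    # Register Open Catalog under the catalog name "opencatalog"
    "spark.sql.catalog.opencatalog": "org.apache.iceberg.spark.SparkCatalog",
    "spark.sql.catalog.opencatalog.type": "rest",
    "spark.sql.catalog.opencatalog.uri": "https://<open_catalog_account>.snowflakecomputing.com/polaris/api/catalog",
    # OAuth client credentials for the service connection, as <client_id>:<client_secret>
    "spark.sql.catalog.opencatalog.credential": "<client_id>:<client_secret>",
    "spark.sql.catalog.opencatalog.warehouse": "<catalog_name>",
    "spark.sql.catalog.opencatalog.scope": "PRINCIPAL_ROLE:ALL",
    # Ask the catalog to vend short-lived storage credentials to the engine
    "spark.sql.catalog.opencatalog.header.X-Iceberg-Access-Delegation": "vended-credentials",
}
```

On Databricks I would expect these to be set as cluster-level Spark configs (or via `spark.conf.set(key, value)`), after which a table could in principle be addressed as `opencatalog.<namespace>.<table>` in SQL; whether this is an officially supported path on Databricks is exactly what I'm asking.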
Could someone please clarify:
1. Does Databricks officially support reading from and writing to Snowflake Open Catalog?
2. If so, are there any reference architectures, examples, or configuration steps available?
Any guidance or pointers to the correct documentation would be greatly appreciated.
Thanks in advance!