Has anyone got this up and working? Federating Snowflake-managed Iceberg tables into Azure Databricks
12-29-2025 05:06 AM
I'm federating Snowflake-managed Iceberg tables into Azure Databricks Unity Catalog so the same data can be queried from both platforms without copying it. Both Snowflake and Databricks run on Azure. The federation itself appears to be in place, and Databricks correctly shows the data source as Iceberg, but querying the table from Databricks fails with this error:
Error getting sample data Your request failed with status FAILED: [BAD_REQUEST] [DELTA_UNIFORM_INGRESS_VIOLATION.CONVERT_TO_DELTA_METADATA_FAILED] Read Delta Uniform fails: Metadata conversion from Iceberg to Delta failed, Failure to initialize configuration for storage account XXXX.blob.core.windows.net: Invalid configuration value detected for fs.azure.account.key.
The setup:
- Snowflake (Iceberg table owner + catalog)
- Azure object storage (stores Iceberg data + metadata)
- Databricks Unity Catalog (federates the Snowflake catalog + enforces governance)
- Databricks compute (Serverless SQL / SQL Warehouse querying the data)
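For reference, a minimal sketch of what the Snowflake side of such a setup typically looks like (the volume, table, and column names, paths, and tenant ID below are hypothetical). On Azure, Snowflake external volumes are defined with azure://<account>.blob.core.windows.net URLs, which is likely where the blob endpoint in the error above originates:

-- Snowflake: external volume pointing at the Azure container holding Iceberg data + metadata
CREATE OR REPLACE EXTERNAL VOLUME iceberg_vol
  STORAGE_LOCATIONS =
    (
      (
        NAME = 'azure_iceberg_location'
        STORAGE_PROVIDER = 'AZURE'
        STORAGE_BASE_URL = 'azure://<storage-account>.blob.core.windows.net/<container>/iceberg/'
        AZURE_TENANT_ID = '<tenant-id>'
      )
    );

-- Snowflake-managed Iceberg table written to that volume
CREATE OR REPLACE ICEBERG TABLE analytics.public.orders (
  order_id BIGINT,
  order_ts TIMESTAMP_NTZ
)
  CATALOG = 'SNOWFLAKE'
  EXTERNAL_VOLUME = 'iceberg_vol'
  BASE_LOCATION = 'orders';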
12-29-2025 05:57 AM
Try this:
- In Snowflake, run SELECT SYSTEM$GET_ICEBERG_TABLE_INFORMATION('<db>.<schema>.<table>'); to retrieve the metadata location.
- In Databricks, create an external location that covers that metadata URI (see the sketch below).
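Roughly what those two steps look like; the catalog, schema, table, credential, and path names are hypothetical, and note that UC external locations use the abfss://...dfs.core.windows.net form even though the metadata URI returned by Snowflake references the blob endpoint, which is part of what the later replies run into:

-- Snowflake: returns JSON including the table's current metadataLocation
SELECT SYSTEM$GET_ICEBERG_TABLE_INFORMATION('analytics.public.orders');

-- Databricks: external location covering that path, backed by a UC storage credential
CREATE EXTERNAL LOCATION IF NOT EXISTS snowflake_iceberg_loc
  URL 'abfss://<container>@<storage-account>.dfs.core.windows.net/iceberg/'
  WITH (STORAGE CREDENTIAL my_access_connector_credential);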
My blog: https://databrickster.medium.com/
12-29-2025 05:59 AM
Also check the behavior on both serverless and classic compute. Since this is catalog federation, Databricks only reads the metadata from the Snowflake catalog and then reads the data files directly from the metadata location, so storage access to that location has to be set up as well.
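One quick way to test the second half of that (Databricks actually reading files at the metadata location) is to list the path from a SQL warehouse through the external location; the path below is hypothetical:

-- Should return the metadata/manifest files if the external location and grants are in place
LIST 'abfss://<container>@<storage-account>.dfs.core.windows.net/iceberg/orders/metadata/' LIMIT 10;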
My blog: https://databrickster.medium.com/
12-29-2025 10:42 AM
On classic compute you can set up Blob storage access with a Spark conf:
spark.conf.set(f"fs.azure.account.key.{account_name}.blob.core.windows.net", account_key)
My blog: https://databrickster.medium.com/
12-29-2025 10:53 PM
Thanks Hubert. I did check the Iceberg metadata location, and Databricks can list the files. The issue is that Snowflake's Iceberg metadata.json contains paths like abfss://…@<acct>.blob.core.windows.net/..., so on UC Serverless Databricks tries to initialize Blob access and fails on fs.azure.account.key (legacy credentials), which Serverless doesn't allow. Databricks/Azure support confirmed that fallback/legacy access cannot be enabled on Serverless, so this seems blocked unless we use Pro/Classic compute or Snowflake can publish DFS (…dfs.core.windows.net) paths. I'll close this thread from my side for now; if someone has this working on Serverless on Azure without account keys, I'd still love to hear how.
02-12-2026 07:10 PM - edited 02-12-2026 07:18 PM
@Hubert-Dudek @ripa1 Running into a similar issue. Were you able to get a resolution on this? I'm assuming spark.conf.set(f"fs.azure.account.key.{account_name}.blob.core.windows.net", account_key) should not be needed, since authentication should go through Unity Catalog.
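For what it's worth, on classic or pro compute the UC-native path (no account keys) would presumably look like the following, assuming the external location sketched earlier in the thread and hypothetical catalog and principal names:

-- Let the querying principal read files at the Iceberg location through UC
GRANT READ FILES ON EXTERNAL LOCATION snowflake_iceberg_loc TO `data_engineers`;

-- Query the federated table; UC should broker storage access instead of fs.azure.account.key
SELECT * FROM snowflake_fed_catalog.public.orders LIMIT 10;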