Unable to read Delta Table using external tools
01-28-2025 01:31 AM - edited 01-28-2025 01:32 AM
I am using the new credential vending API to get a token and URL for my tables in Unity Catalog.
I get the token and URL, and I am able to scan the folder using read_parquet, but NOT with any Delta Lake functions: neither TableExists, scan_delta nor delta_scan from Polars or DuckDB works.
The table is written using PySpark, with no particular settings.
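Roughly what I'm doing, as a simplified sketch: the URL and token below are placeholders for the values the credential-vending API returns, I'm assuming Azure (ADLS) storage, and the exact storage-option key name may differ.

```python
import polars as pl

# Placeholders for the values returned by the credential-vending API.
url = "abfss://container@account.dfs.core.windows.net/path/to/table"
storage_options = {"sas_token": "<vended token>"}  # exact key name may vary

# This works: scanning the data files directly as Parquet.
pl.scan_parquet(f"{url}/*.parquet", storage_options=storage_options).collect()

# This fails with the DeltaError shown below.
pl.scan_delta(url, storage_options=storage_options).collect()
```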
When reading from Polars:
DeltaError: Table metadata is invalid: Number of checkpoint files '0' is not equal to number of checkpoint metadata parts 'None'
When reading from DuckDB:
IOException: IO Error: Hit DeltaKernel FFI error (from: While trying to read from delta table:
Are there any Databricks-specific aspects to reading the Delta tables?
When reading the _same folder_ using read_parquet, it works.
01-28-2025 03:55 AM
Hello!
It sounds like you're encountering issues when reading Delta Lake tables with Polars and DuckDB, but not with read_parquet. This could be due to Databricks-specific configuration requirements for Delta Lake tables. Ensure you're using the correct format ("delta") when reading Delta tables, and verify that the _delta_log transaction log folder is present and correctly configured.
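As a quick check, you can list the log folder with the vended credentials, for example via fsspec/adlfs (a sketch only; it assumes Azure storage, and the account, container and path are placeholders):

```python
import fsspec

# Confirm _delta_log is visible with the vended credentials.
# Assumes Azure storage and the adlfs backend; placeholders throughout.
fs = fsspec.filesystem(
    "abfss",
    account_name="<storage account>",
    sas_token="<vended token>",
)
print(fs.ls("<container>/path/to/table/_delta_log"))
```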
01-29-2025 07:04 AM - edited 01-29-2025 07:05 AM
Yes, everything is working perfectly in Unity Catalog and Databricks. The _delta_log transaction folder is present.
02-17-2025 08:01 AM
I'm also having the exact same problem as @oakhill: whenever I try to read any Delta table from DuckDB, I get the error. Could someone explain how the DuckDB delta extension can be made to work with Delta Lake tables in Databricks, please?
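For reference, this is roughly what I'm running (simplified; the Azure secret setup is my assumption and depends on your storage and auth):

```python
import duckdb

con = duckdb.connect()
con.sql("INSTALL delta")
con.sql("LOAD delta")
con.sql("INSTALL azure")
con.sql("LOAD azure")

# Placeholder credentials; depends on your storage/auth setup.
con.sql("CREATE SECRET az (TYPE AZURE, CONNECTION_STRING '<vended connection string>')")

# Fails with the DeltaKernel FFI IOException.
con.sql("SELECT * FROM delta_scan('abfss://container/path/to/table') LIMIT 5").show()
```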
an hour ago
When I copy a problematic Delta table and read the copy, the issue disappears. It seems to be related to the new Delta checkpointPolicy (v2), which is not supported by the Rust implementation of Delta Lake but is fine with the Scala/Java one (deltalake vs delta-spark).
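To check whether a table is affected, something like this from a Databricks notebook should work (a sketch; the table names are illustrative and `spark` is the notebook's SparkSession):

```python
# Look for delta.checkpointPolicy among the table properties.
props = spark.sql("SHOW TBLPROPERTIES my_catalog.my_schema.my_table").collect()
print([r for r in props if "checkpoint" in r.key.lower()])

# Workaround sketch: rewrite the data into a fresh table. A plain CTAS is
# used here because a DEEP CLONE would likely carry the table properties
# (and thus the v2 checkpoint policy) over to the copy.
spark.sql(
    "CREATE TABLE my_catalog.my_schema.my_table_copy AS "
    "SELECT * FROM my_catalog.my_schema.my_table"
)
```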

