01-18-2023 05:06 AM
To identify which Delta Lake features are available on a given installation, it is important to have a robust way to determine the Delta Lake version. For OSS Delta Lake, I found that the Scala snippet below does the job.
import io.delta
println(io.delta.VERSION)
I am not sure whether there is an equivalent method in Python to get the version at runtime. I am not after something like pip show deltalake.
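The closest runtime option I am aware of is reading the installed package metadata via the standard library. A minimal sketch, assuming the OSS delta-spark or deltalake package is pip-installed; note this reports package metadata, not necessarily the Delta build actually shipped inside a Databricks runtime:
from importlib.metadata import version, PackageNotFoundError

# Probe the common Delta-related PyPI package names; which of these is
# actually installed in a given environment is an assumption.
for pkg in ("delta-spark", "deltalake"):
    try:
        print(pkg, version(pkg))
    except PackageNotFoundError:
        print(pkg, "not installed")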
Anyway, the other aspect is that the above snippet returns a strange value on Databricks. For example, on DBR 12.0 I got 1.1.0, which does not match the 2.2.0 version published on the DBR 12.0 release notes page.
I know that Delta Lake on Databricks does not necessarily have to match a specific OSS version (at least before Delta was open-sourced).
Accepted Solutions
02-02-2023 10:09 AM
@Yousry Mohamed - could you please check the "Delta Lake API compatibility matrix" section of the DBR runtime release notes (DBR version vs. compatible Delta Lake version) for the mapping?
Reference: https://docs.databricks.com/release-notes/runtime/releases.html#delta-api-compatibility-matrix
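As a complement, the DBR version itself can be read at runtime and then looked up in that matrix. A sketch assuming a Databricks notebook, where the spark session is predefined and the cluster usage tag conf key below is populated (it is absent in plain OSS Spark):
# Read the Databricks Runtime version from a cluster usage tag,
# then map it to a Delta Lake version via the compatibility matrix.
dbr = spark.conf.get("spark.databricks.clusterUsageTags.sparkVersion")
print(dbr)  # e.g. "12.0.x-scala2.12"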
02-02-2023 12:06 PM
Thanks Shan, a really useful page with other bits of information as well.

