03-14-2023 03:48 AM
Hello everyone,
From a list of folders on s3, how can I filter which ones are Delta tables, without trying to read each one at a time?
Thanks,
Ovi
03-14-2023 04:13 AM
Hello @Ovidiu Eremia,
To filter which folders on S3 contain Delta tables, you can look for the files that Delta Lake itself creates. Delta Lake stores its transaction log in a folder named _delta_log at the root of each Delta table, so you can check for that folder.
In the code below, we first get the S3 bucket and list the objects under the specified prefix. We then keep only the objects whose keys contain _delta_log/, which marks them as part of a Delta table's log. Finally, we extract the unique table paths from those keys and print the list of folders that contain Delta tables.
import boto3

s3 = boto3.resource('s3')

bucket_name = 'your-bucket-name'
prefix = 'path/to/folders'

# Get the S3 bucket and list the objects under the specified prefix
bucket = s3.Bucket(bucket_name)
objects = bucket.objects.filter(Prefix=prefix)

# Keep only the object keys that sit inside a _delta_log folder
delta_objects = [obj.key for obj in objects if '/_delta_log/' in obj.key]

# Extract the unique table paths from the Delta log object keys
delta_folders = sorted({key.split('/_delta_log/')[0] for key in delta_objects})

print(delta_folders)
References: https://boto3.amazonaws.com/v1/documentation/api/latest/guide/collections.html
Hope this helps.
Thanks & Regards,
Nandini
03-14-2023 05:15 AM
Hi @Nandini N,
Thank you for your answer, which also includes an example.
But for my use case, I need to do this in Scala.
Could you please point me in the right direction on how to implement it?
Thank you again,
Ovi
03-14-2023 05:35 AM
This is an elegant one: it uses isDeltaTable() from DeltaTableUtils together with dbutils.
import org.apache.hadoop.fs.Path
import org.apache.spark.sql.delta.DeltaTableUtils

val s3Path = "s3://my-bucket/my-folder"

// Get a list of all the folders in the S3 path
val folders = dbutils.fs.ls(s3Path).map(_.path)

// Keep only the paths that are Delta tables
val deltaFolders = folders.filter(path => DeltaTableUtils.isDeltaTable(spark, new Path(path)))

// Print the resulting list of Delta tables
deltaFolders.foreach(println)
Note: the same check is available in Python, where DeltaTable.isDeltaTable() from the delta.tables package takes the place of DeltaTableUtils.
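For completeness, here is a minimal sketch of that Python variant, assuming a Databricks notebook where spark and dbutils are available and using the same illustrative s3://my-bucket/my-folder path:
from delta.tables import DeltaTable

s3_path = "s3://my-bucket/my-folder"  # illustrative path, adjust to your bucket

# List the entries in the S3 path; dbutils.fs.ls reports folder names with a trailing "/"
folders = [f.path for f in dbutils.fs.ls(s3_path) if f.name.endswith("/")]

# Keep only the folders that are Delta tables
delta_folders = [p for p in folders if DeltaTable.isDeltaTable(spark, p)]

for p in delta_folders:
    print(p)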
Thanks & Regards,
Nandini
03-14-2023 05:48 AM
Nice solution @Nandini N!
Thanks a lot for pointing it out to me!
Have a nice day,
Ovi
03-14-2023 05:56 AM
Kudos @Ovidiu Eremia, it would really help us if you could select the best answer.