Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Filter only Delta tables from an S3 folders list

Ovi
New Contributor III

Hello everyone,

From a list of folders on s3, how can I filter which ones are Delta tables, without trying to read each one at a time?

Thanks,

Ovi

1 ACCEPTED SOLUTION

NandiniN
Databricks Employee

This is an elegant one: it uses isDeltaTable() from DeltaTableUtils (which takes the SparkSession and a Hadoop Path) together with dbutils.

import org.apache.spark.sql.delta.DeltaTableUtils
import org.apache.hadoop.fs.Path

val s3Path = "s3://my-bucket/my-folder"

// List all the folders in the S3 path
val folders = dbutils.fs.ls(s3Path).map(_.path)

// Keep only the folders that are Delta tables
val deltaFolders = folders.filter(path => DeltaTableUtils.isDeltaTable(spark, new Path(path)))

// Print the resulting list of Delta table folders
deltaFolders.foreach(println)

Note: this is also available in Python, where the public equivalent of DeltaTableUtils is delta.tables.DeltaTable (use DeltaTable.isDeltaTable(spark, path)).

Thanks & Regards,

Nandini


5 REPLIES

NandiniN
Databricks Employee

Hello @Ovidiu Eremia,

To find which folders on S3 contain Delta tables, you can look for the marker Delta Lake itself leaves behind: its transaction log, stored in a folder named _delta_log at the root of each Delta table. So you can simply check for that folder.

In the code below, we first get the S3 bucket and list the objects under the specified prefix. We then keep only the objects whose keys contain /_delta_log/ (object keys are full file paths, so they will not end with _delta_log/). Finally, we extract the table root paths from those keys, de-duplicate them, and print the list of folders that contain Delta tables.

import boto3

s3 = boto3.resource('s3')
bucket_name = 'your-bucket-name'
prefix = 'path/to/folders'

# Get the S3 bucket and list the objects under the specified prefix
bucket = s3.Bucket(bucket_name)
objects = bucket.objects.filter(Prefix=prefix)

# Keep only the objects that live inside a _delta_log folder
delta_objects = [obj.key for obj in objects if '/_delta_log/' in obj.key]

# Extract the table root paths and de-duplicate them
delta_folders = sorted({key.split('_delta_log/')[0] for key in delta_objects})

print(delta_folders)
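You can sanity-check the filter-and-extract step locally, without any S3 access, by running the same logic over a few hand-written sample keys (the paths below are hypothetical):

```python
# Hypothetical object keys, shaped like an S3 listing of two folders,
# one of which is a Delta table
keys = [
    "path/to/folders/sales/_delta_log/00000000000000000000.json",
    "path/to/folders/sales/part-00000.snappy.parquet",
    "path/to/folders/raw_csv/data.csv",
]

# Keep only keys inside a _delta_log folder, then recover the table roots
delta_objects = [k for k in keys if '/_delta_log/' in k]
delta_folders = sorted({k.split('_delta_log/')[0] for k in delta_objects})

print(delta_folders)  # ['path/to/folders/sales/']
```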

References: https://boto3.amazonaws.com/v1/documentation/api/latest/guide/collections.html

Hope this helps.

Thanks & Regards,

Nandini

Ovi
New Contributor III

Hi @Nandini N,

Thank you for your answer, which also includes an example.

But for my use case, I need to do it using Scala.

Could you please point me in the right direction on how to implement this?

Thank you again,

Ovi

NandiniN
Databricks Employee

This is an elegant one: it uses isDeltaTable() from DeltaTableUtils (which takes the SparkSession and a Hadoop Path) together with dbutils.

import org.apache.spark.sql.delta.DeltaTableUtils
import org.apache.hadoop.fs.Path

val s3Path = "s3://my-bucket/my-folder"

// List all the folders in the S3 path
val folders = dbutils.fs.ls(s3Path).map(_.path)

// Keep only the folders that are Delta tables
val deltaFolders = folders.filter(path => DeltaTableUtils.isDeltaTable(spark, new Path(path)))

// Print the resulting list of Delta table folders
deltaFolders.foreach(println)

Note: this is also available in Python, where the public equivalent of DeltaTableUtils is delta.tables.DeltaTable (use DeltaTable.isDeltaTable(spark, path)).

Thanks & Regards,

Nandini

Ovi
New Contributor III

Nice solution @Nandini N!

Thanks a lot for pointing it out to me!

Have a nice day,

Ovi

NandiniN
Databricks Employee

Kudos @Ovidiu Eremia, it would really help us if you could select the best answer 🙂
