Unity Catalog with Structured Streaming
12-12-2024 01:32 AM - edited 12-12-2024 01:33 AM
Hi,
Our project uses Spark Structured Streaming Scala notebooks to process files stored in an S3 bucket, with the jobs running in Single User access mode.
For one of the jobs, we need to use a file arrival trigger. To enable this, the S3 location must be registered as an external location in Unity Catalog.
My question is: if we register the entire S3 bucket (instead of only the specific location needed), can we directly access Delta tables in the bucket like this:
spark.readStream
  .format("delta")
  ...
  .load("s3://my-bucket/delta-table-1/")
  .writeStream
  .format("delta")
  ...
  .option("checkpointLocation", aggregatedEvidencesCheckpoint)
  .trigger(Trigger.AvailableNow())
  .start("s3://my-bucket/delta-table-2/")
When directly accessing the bucket ("s3://my-bucket/..."), is the metadata for the Delta tables stored in the bucket itself, or is it managed through Unity Catalog?
Labels: Spark
12-12-2024 05:02 AM
When you register the S3 bucket (or a prefix of it) as an external location in Unity Catalog, you can read and write Delta tables in it by path with spark.readStream and writeStream. Unity Catalog governs that access: it checks your READ FILES / WRITE FILES privileges on the external location and supplies the storage credential. However, for path-based access the Delta tables' own metadata is stored in the bucket itself: each table's transaction log lives in its _delta_log directory alongside the data files. Unity Catalog only holds catalog metadata (name, schema, grants, lineage) for tables you explicitly register with it; it does not take over the Delta log.
So yes, the snippet you provided will work; completed with its import, it looks like this:
import org.apache.spark.sql.streaming.Trigger

// aggregatedEvidencesCheckpoint must point to a durable path the job can write to
spark.readStream
  .format("delta")
  .load("s3://my-bucket/delta-table-1/")
  .writeStream
  .format("delta")
  .option("checkpointLocation", aggregatedEvidencesCheckpoint)
  .trigger(Trigger.AvailableNow())
  .start("s3://my-bucket/delta-table-2/")
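If you do want Unity Catalog to manage the table metadata (grants, lineage, discovery by name), you can additionally register the path as an external table and stream against its three-level name. A minimal sketch, assuming a catalog named main and a schema named default (substitute your own names), and that your principal has CREATE TABLE on the external location:

```scala
// Hypothetical catalog/schema names; the LOCATION must fall under an
// external location you have CREATE TABLE rights on.
spark.sql("""
  CREATE TABLE IF NOT EXISTS main.default.delta_table_1
  USING DELTA
  LOCATION 's3://my-bucket/delta-table-1/'
""")

// The same stream, expressed against the catalog name. Note the data
// and the _delta_log still live in the bucket either way.
val stream = spark.readStream.table("main.default.delta_table_1")
```

Either form reads the same transaction log; registration only adds the governance layer on top.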

