Hi,
Our project uses Spark Structured Streaming Scala notebooks to process files stored in an S3 bucket, with the jobs running in Single User access mode.
For one of the jobs, we need to use a file arrival trigger. To enable this, the S3 location must be registered as an external location in Unity Catalog.
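For context, this is roughly how we would register the location (a sketch only; the names my_bucket_location and my_storage_credential are placeholders, and the storage credential is assumed to already exist with access to the bucket):

```scala
// Register the whole bucket as a Unity Catalog external location.
// Assumes a storage credential named "my_storage_credential" has already
// been created for this bucket; both names here are placeholders.
spark.sql("""
  CREATE EXTERNAL LOCATION IF NOT EXISTS my_bucket_location
  URL 's3://my-bucket/'
  WITH (STORAGE CREDENTIAL my_storage_credential)
""")
```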
My question is: if we register the entire S3 bucket as an external location (instead of only the specific path needed for the trigger), can we directly access the Delta tables in the bucket like this:
spark.readStream
.format("delta")
...
.load("s3://my-bucket/delta-table-1/")
.writeStream
.format("delta")
...
.option("checkpointLocation", aggregatedEvidencesCheckpoint)
.trigger(Trigger.AvailableNow())
.start("s3://my-bucket/delta-table-2/")
When accessing the tables directly by path ("s3://my-bucket/..."), is the metadata for the Delta tables stored in the bucket itself, or is it managed through Unity Catalog?