Databricks-Autoloader-S3-KMS
12-14-2023 12:46 AM
Hi, I am working on a requirement where I am using Auto Loader in a DLT pipeline to ingest new files as they arrive.
This flow is working fine. However, I am facing an issue when the source is an S3 bucket with SSE-KMS enabled: when the pipeline tries to write the schemaLocation, it throws an error.
What extra changes need to be made? I have already granted the KMS-related permissions to the instance profile that is being used. What changes do I need to make in my notebook/pipeline to get this working?
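Roughly, the pipeline looks like the sketch below (bucket path and file format are placeholders, not my real values); the schema inference state (schemaLocation) is written by the pipeline itself, and that write is where the KMS error appears:

```python
import dlt

@dlt.table
def raw_events():
    # Auto Loader stream from the SSE-KMS encrypted source bucket.
    # Path and format are placeholders for illustration only.
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load("s3://my-sse-kms-bucket/landing/")
    )
```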
12-14-2023 10:50 AM
Can you please paste the exact error? If it is related to KMS, check the following:
1. The IAM role policy and the KMS key policy should both allow the required KMS actions (see the policy sketch after this list).
2. If you mounted the source S3 bucket, did you pass the encryption settings via extraConfigs when mounting? Also verify that the IAM role ARN used for the mount has the KMS-related permissions (see the mount sketch below).
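For point 1, a rough sketch of the KMS permissions the instance-profile role typically needs on the key used by the bucket (the key ARN is a placeholder; the same actions should also be allowed for the role in the KMS key's own key policy):

```python
# Sketch of the IAM policy statement to attach to the instance-profile role.
# All values are placeholders; adjust the key ARN to your setup.
kms_statement = {
    "Effect": "Allow",
    "Action": [
        "kms:Decrypt",
        "kms:Encrypt",
        "kms:GenerateDataKey*",
        "kms:ReEncrypt*",
        "kms:DescribeKey",
    ],
    "Resource": "arn:aws:kms:us-east-1:123456789012:key/REPLACE-WITH-KEY-ID",
}
```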
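For point 2, a sketch of mounting the bucket with the SSE-KMS settings passed via extra_configs (bucket name, mount point, and key ARN are placeholders; the fs.s3a property names assume the S3A server-side-encryption settings supported on Databricks):

```python
# Sketch only: mount the SSE-KMS bucket with encryption settings.
# All names/ARNs below are placeholders.
kms_key_arn = "arn:aws:kms:us-east-1:123456789012:key/REPLACE-WITH-KEY-ID"

dbutils.fs.mount(
    source="s3a://my-sse-kms-bucket",
    mount_point="/mnt/my-sse-kms-bucket",
    extra_configs={
        "fs.s3a.server-side-encryption-algorithm": "SSE-KMS",
        "fs.s3a.server-side-encryption.key": kms_key_arn,
    },
)
```

If the pipeline reads the bucket directly rather than through a mount, the equivalent settings can be supplied as Spark configuration on the pipeline/cluster, e.g. spark.hadoop.fs.s3a.server-side-encryption-algorithm set to SSE-KMS and spark.hadoop.fs.s3a.server-side-encryption.key set to the key ARN.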