08-01-2022 04:27 PM
Hey everyone!
I'm building a DLT pipeline that reads files from S3 (or tries to) and then writes them into different directories in my S3 bucket. The problem is that I usually access S3 with an instance profile attached to the cluster, but DLT does not give me the option to attach an instance profile to the job cluster it creates.
What is the solution here? Do I somehow have to pass my AWS keys in the DLT notebook?
Accepted Solutions
08-02-2022 06:49 AM
{
  "clusters": [
    {
      "label": "default",
      "aws_attributes": {
        "instance_profile_arn": "arn:aws:..."
      }
    },
    {
      "label": "maintenance",
      "aws_attributes": {
        "instance_profile_arn": "arn:aws:..."
      }
    }
  ]
}
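(For context: this block goes in the pipeline's settings JSON, which you can edit as JSON from the DLT pipeline UI or supply through the pipelines API. The instance profile also needs to be registered in the workspace before any cluster, including the DLT-managed ones, can use it.)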
08-02-2022 03:53 AM
Hi @Quinn Harty, if you need an instance profile or other configuration to access your storage location, specify it for both the default cluster and the maintenance cluster, as in the settings JSON shown above.
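Once the instance profile is attached to both clusters, the notebook itself does not need any AWS keys. A minimal sketch of a DLT table reading from S3 with Auto Loader, assuming a hypothetical bucket path, table name, and file format:

import dlt

@dlt.table(
    comment="Raw files ingested from S3; access comes from the cluster's instance profile, no keys in the notebook."
)
def raw_events():
    return (
        spark.readStream.format("cloudFiles")      # Auto Loader for incremental file ingestion
        .option("cloudFiles.format", "json")       # assumed source format
        .load("s3://my-bucket/landing/events/")    # hypothetical path in your bucket
    )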
08-17-2022 03:15 PM
Hi @Quinn Harty,
Just a friendly follow-up. Did any of the responses help you resolve your question? If so, please mark that answer as best. Otherwise, please let us know if you still need help.

