08-01-2022 04:27 PM
Hey everyone!
I'm building a DLT pipeline that reads files from S3 (or tries to) and then writes them into different directories in my S3 bucket. The problem is that I usually access S3 with an instance profile attached to a cluster, but DLT does not give me the option to attach an instance profile to the job cluster it creates.
What is the solution here? Do I somehow have to pass my AWS keys in the DLT notebook?
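For context, a minimal sketch of the kind of pipeline I have in mind, using Auto Loader to ingest from S3 (the bucket path, file format, and routing column are just placeholders):

import dlt
from pyspark.sql.functions import col

# Placeholder source prefix in my bucket.
SOURCE_PATH = "s3://my-bucket/incoming/"

@dlt.table(comment="Raw files ingested from S3 with Auto Loader")
def raw_files():
    return (
        spark.readStream.format("cloudFiles")   # Auto Loader
        .option("cloudFiles.format", "json")    # placeholder format
        .load(SOURCE_PATH)
    )

@dlt.table(comment="Subset of records routed to its own output table/directory")
def routed_files():
    # "category" is a placeholder column used to split records downstream.
    return dlt.read_stream("raw_files").where(col("category") == "a")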
08-02-2022 03:53 AM
Hi @Quinn Harty, if you need an instance profile or other configuration to access your storage location, specify it for both the default cluster and the maintenance cluster.
08-02-2022 06:49 AM
{
  "clusters": [
    {
      "label": "default",
      "aws_attributes": {
        "instance_profile_arn": "arn:aws:..."
      }
    },
    {
      "label": "maintenance",
      "aws_attributes": {
        "instance_profile_arn": "arn:aws:..."
      }
    }
  ]
}
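These cluster settings live in the pipeline's configuration JSON rather than in the notebook, so the pipeline code itself does not change; you can typically edit them from the pipeline's Settings page (JSON view) or through the Pipelines API, and the same instance profile ARN you would attach to an interactive cluster works here.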
08-17-2022 03:15 PM
Hi @Quinn Harty,
Just a friendly follow-up. Did any of the responses help you resolve your question? If one did, please mark it as the best answer. Otherwise, please let us know if you still need help.