Hey everyone!
I'm building a DLT pipeline that reads files from S3 (or tries to) and then writes them out to different directories in the same S3 bucket. The problem is that I normally access S3 with an instance profile attached to my cluster, but DLT doesn't seem to give me an option to attach an instance profile to the job cluster it creates.
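For reference, on a normal cluster I'd just attach the profile in the cluster spec via `aws_attributes.instance_profile_arn` (the ARN below is a placeholder):

```json
{
  "aws_attributes": {
    "instance_profile_arn": "arn:aws:iam::123456789012:instance-profile/my-s3-access-profile"
  }
}
```

but I can't find an equivalent field anywhere in the DLT pipeline UI.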
What's the solution here? Do I really have to pass my AWS keys directly in the DLT notebook?