01-10-2023 04:00 AM
Hi all, I am trying to read streams directly from AWS S3. I set the instance profile, but when I run the workflow it fails with the error below:
"No AWS Credentials provided by TemporaryAWSCredentialsProvider : shaded.databricks.org.apache.hadoop.fs.s3a.CredentialInitializationException: Access key, secret key or session token is unset: "
I added the following to my cluster configuration:
fs.s3a.aws.credentials.provider org.apache.hadoop.fs.s3a.TemporaryAWSCredentialsProvider
fs.s3a.access.key <AccessKeyId>
fs.s3a.secret.key <SecretAccessKey>
It still fails with the same error. Could someone please help me with how to pass these credentials for DLT workflows?
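One thing worth checking: a DLT pipeline provisions its own clusters, so Spark conf set on an interactive cluster does not carry over; it has to go in the pipeline settings. A minimal sketch of what that section of the pipeline JSON might look like (the instance profile ARN is a placeholder, and the exact structure should be confirmed against your workspace):

```json
{
  "clusters": [
    {
      "label": "default",
      "aws_attributes": {
        "instance_profile_arn": "arn:aws:iam::<account-id>:instance-profile/<profile-name>"
      },
      "spark_conf": {
        "fs.s3a.aws.credentials.provider": "org.apache.hadoop.fs.s3a.TemporaryAWSCredentialsProvider"
      }
    }
  ]
}
```

With an instance profile attached this way, the access key and secret key confs should not be needed at all, since credentials come from the instance metadata.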
01-10-2023 06:54 AM
Hi @SUDHANSHU RAJ, is UC enabled on this workspace? What is the access mode set on the cluster?
Is this coming from the metastore or directly when you read from S3? Is the S3 cross-account?
01-10-2023 07:42 AM
Dear Vivian,
UC is not enabled on this workspace. I am using an instance profile set up as per the Databricks documentation.
S3 is set up for cross-account access, and as I said, I am able to run dbutils.fs.ls("s3a://zuk-comparis-poc/").
But when I run the workflow, which invokes a Delta notebook, it gives me this error.
This is a standard cluster, so I have not enabled IAM passthrough.
Am I missing something? Thanks in advance.
01-11-2023 04:27 AM
@SUDHANSHU RAJ Can you please share the pipeline settings in JSON and also the cluster policy JSON? If this works on a standard cluster but not from a DLT pipeline, we need to verify the cluster settings inside the DLT pipeline configuration.
01-12-2023 02:07 AM
Hi Vivian,
Thanks for your help. I am happy to report that it's working now. I think the problem was in assigning the proper roles and access to the instance profile (in AWS) that I created for this purpose. Once I added a few more rules, it started working.
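For anyone hitting the same error: the fix above was on the AWS side, in the IAM policy attached to the instance profile's role. An illustrative sketch of the kind of S3 permissions such a policy typically needs (the bucket name is taken from this thread; the exact set of actions depends on your workload, so treat this as an assumption, not the exact policy used here):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket", "s3:GetBucketLocation"],
      "Resource": "arn:aws:s3:::zuk-comparis-poc"
    },
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject", "s3:DeleteObject"],
      "Resource": "arn:aws:s3:::zuk-comparis-poc/*"
    }
  ]
}
```

Note that bucket-level actions (ListBucket) and object-level actions (GetObject) apply to different resource ARNs, which is a common source of AccessDenied-style failures.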
Thanks again for all your help.