yesterday
I only have an AWS Access Key ID and Secret Access Key, and I want to use this information to access S3.
However, the official documentation states that I need to set the AWS_SECRET_ACCESS_KEY and AWS_ACCESS_KEY_ID environment variables, but I cannot find a way to set these two environment variables.
documentation: Connect to Amazon S3 | Databricks on Google Cloud
yesterday - last edited yesterday
Hi @liu ,
The proper way is to set them on your cluster: under the Advanced options section you can define environment variables, so they are scoped at the cluster level. It's recommended to store the values themselves in a secret scope and reference them as environment variables:
Use a secret in a Spark configuration property or environment variable | Databricks on AWS
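Following the linked page, the environment variables field in the cluster's Advanced options accepts `{{secrets/<scope>/<key>}}` references, so the plaintext keys never appear in the cluster config. A sketch of what that field might contain (the scope name `aws-keys` and the key names are placeholders you would replace with your own):

```
AWS_ACCESS_KEY_ID={{secrets/aws-keys/access-key-id}}
AWS_SECRET_ACCESS_KEY={{secrets/aws-keys/secret-access-key}}
```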
But you can also configure it at notebook scope. I think the following Python snippet will be sufficient:
import os

# Set the AWS credentials for this notebook's Python process
os.environ["AWS_ACCESS_KEY_ID"] = "your-access-key"
os.environ["AWS_SECRET_ACCESS_KEY"] = "your-secret-key"
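As a side note, here is a minimal runnable sketch of why the notebook-scope approach works (the credential values are obvious placeholders): variables set with `os.environ` are visible to the notebook's Python process and inherited by child processes, which is the standard environment-variable credential chain that AWS SDKs such as boto3 look at.

```python
import os
import subprocess
import sys

# Placeholder values for illustration only -- never hard-code real
# credentials; prefer a Databricks secret scope.
os.environ["AWS_ACCESS_KEY_ID"] = "EXAMPLE_KEY_ID"
os.environ["AWS_SECRET_ACCESS_KEY"] = "EXAMPLE_SECRET"

# Variables set via os.environ apply to this process and are inherited
# by any child process it launches, which is how SDKs running in the
# same session pick the credentials up.
child = subprocess.run(
    [sys.executable, "-c", "import os; print(os.environ['AWS_ACCESS_KEY_ID'])"],
    capture_output=True,
    text=True,
)
print(child.stdout.strip())  # prints the value the parent process set
```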
yesterday
Thank you very much for your answer,
But I can't use other clusters; I can only use serverless. Can I set them for serverless?
I also configured it at notebook scope, but it is not working properly at the moment. I am being told that I do not have permission, and I am still investigating.
yesterday
Sorry, somehow I didn't notice serverless in the thread title. But I guess setting env variables at notebook scope should work.
One question: is there any reason why you can't use UC (Unity Catalog)? The above way is a deprecated method of configuring storage access.
yesterday
Sorry, I'm a newbie. Currently, I can only add S3 as external data through a role. Regarding your suggestion about using UC: can I turn the CSV file in S3 into a volume in UC using only the access key ID and secret key, or is there another method?
yesterday - last edited yesterday
@liu , I guess this could be related to a serverless limitation. The documentation says that you must use Unity Catalog to connect to external data sources. That's probably why you can't connect.
yesterday
@szymon_dybczak
Thank you very much for your answer
I will try my best to link the content of S3 with UC
Once more, thank you
13 hours ago
No problem @liu , if you need some help with setting up UC, we are here to help.