Instead of setting the AWS access key and secret key in hadoopConfiguration, I would like to provide them through the environment variables AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY.
How can I do that in Databricks?
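For context, this is roughly what we do today and what I'd like to replace. The fs.s3a.* property names below are an assumption about our current setup (we may be on a different connector), not something fixed:

```scala
// Current approach (assumed s3a connector): credentials set on the Hadoop
// configuration from the notebook. We'd like to drop these calls and rely on the
// AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY environment variables instead.
spark.sparkContext.hadoopConfiguration.set("fs.s3a.access.key", "<access-key>")
spark.sparkContext.hadoopConfiguration.set("fs.s3a.secret.key", "<secret-key>")
```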
Yes, I set the env variables as mentioned, but with one difference: I haven't set them per cluster. I added the init script directly under dbfs:/databricks/init/ itself. Will that make a difference?
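Concretely, I created the script from a notebook with something along these lines; the script name is just an example, and appending the exports to /databricks/spark/conf/spark-env.sh is my reading of the documented approach, so treat it as an assumption:

```scala
// Sketch of how the init script was created (names and paths are examples).
// As far as I understand, dbfs:/databricks/init/ is the legacy *global* init
// script folder (runs on every cluster), while dbfs:/databricks/init/<cluster-name>/
// would scope the script to a single cluster.
dbutils.fs.put(
  "dbfs:/databricks/init/set-aws-env.sh",
  """#!/bin/bash
    |cat >> /databricks/spark/conf/spark-env.sh <<'EOF'
    |export AWS_ACCESS_KEY_ID="REPLACE_WITH_ACCESS_KEY"
    |export AWS_SECRET_ACCESS_KEY="REPLACE_WITH_SECRET_KEY"
    |EOF
    |""".stripMargin,
  true)  // overwrite if the script already exists
```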
No, I've just set my env variables in the script. Our cluster is currently in production, so I can't test it right now. Once I get access I'll add some echo statements and let you know the result.
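The idea is roughly this: have the init script drop a marker into DBFS (assuming /dbfs is already mounted when the script runs, and with a log path I'd pick arbitrarily), then check it from a notebook after the restart:

```scala
// Planned check once I can restart the cluster. The init script would get an extra line, e.g.
//   echo "init ran at $(date), key set: ${AWS_ACCESS_KEY_ID:+yes}" >> /dbfs/tmp/aws-env-init.log
// and afterwards I can read the marker file from a notebook:
println(dbutils.fs.head("dbfs:/tmp/aws-env-init.log"))
```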
Thanks for your response. I want to add one more environment variable apart from the AWS properties. I just created the init script as mentioned in the documentation and restarted my cluster. But when I read the env variable with sys.env.get(envName); in...
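For reference, this is the kind of check I'm running; MY_CUSTOM_VAR is just a placeholder for the extra variable's name:

```scala
// Read the variable on the driver; None means the export never reached the
// environment the driver JVM was started with.
val envName = "MY_CUSTOM_VAR"  // placeholder name
println(sys.env.get(envName))

// Executors have their own environment, so checking there has to happen inside a task:
println(sc.parallelize(Seq(1)).map(_ => sys.env.get(envName)).collect().mkString(", "))
```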