by
spabba
• New Contributor II
- 1345 Views
- 1 replies
- 2 kudos
Currently, our email notification subject for errors uses the format: <[AWS Account]> Error in run <RUN ID> of <Job Name>. In our Databricks environment we have jobs across multiple environments such as Dev/QA/UAT/Prod, and it is very hard for us to...
Latest Reply
Hi @Sanath Pabba, great to meet you, and thanks for your question! Let's see if your peers in the community have an answer for you. Thanks.
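Since the alert subject already embeds <Job Name>, one common workaround is to encode the environment (Dev/QA/UAT/Prod) in each job's name so it shows up in the subject line. A minimal sketch of that idea against the Jobs API 2.1; the workspace URL, token, and job-ID-to-environment mapping are placeholder assumptions, not values from the thread:

```python
# Sketch: prefix job names with their environment so email subjects show it.
import requests

HOST = "https://<your-workspace>.cloud.databricks.com"  # assumption: your workspace URL
TOKEN = "<personal-access-token>"                       # assumption: a valid PAT
ENV_BY_JOB = {1234: "PROD", 5678: "QA"}                 # assumption: your job IDs

headers = {"Authorization": f"Bearer {TOKEN}"}

for job_id, env in ENV_BY_JOB.items():
    # Read the current job settings so only the name is changed.
    job = requests.get(f"{HOST}/api/2.1/jobs/get",
                       headers=headers, params={"job_id": job_id}).json()
    name = job["settings"]["name"]
    if not name.startswith(f"[{env}]"):
        # Jobs API "update" patches only the fields supplied in new_settings.
        requests.post(f"{HOST}/api/2.1/jobs/update", headers=headers,
                      json={"job_id": job_id,
                            "new_settings": {"name": f"[{env}] {name}"}})
```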
by
sawya
• New Contributor II
- 2649 Views
- 3 replies
- 0 kudos
Hi everyone, I have a Databricks workspace in an AWS account that I have to migrate to a new AWS account. Do you know how I can do it? Or is it better to recreate a new one and move all the workbooks? And if I choose to create a new one, how can I export ...
Latest Reply
@AMADOU THIOUNE Can you check the link below to export the job runs? https://docs.databricks.com/jobs.html#export-job-runs. Try to reuse the same job_id with the /update and /reset endpoints; it should allow you much better access to previous run re...
2 More Replies
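For the export step the reply points to, here is a minimal sketch of the Jobs API runs/export call from the linked docs, which returns a run's notebook output as HTML "views" you can archive before moving accounts. The workspace URL, token, and run_id are placeholder assumptions:

```python
# Sketch: export one job run's rendered views to local HTML files.
import requests

HOST = "https://<old-workspace>.cloud.databricks.com"  # assumption: source workspace
TOKEN = "<personal-access-token>"                      # assumption: a valid PAT

resp = requests.get(f"{HOST}/api/2.0/jobs/runs/export",
                    headers={"Authorization": f"Bearer {TOKEN}"},
                    params={"run_id": 42})             # assumption: a real run_id
resp.raise_for_status()

# Each view is an HTML rendering of one notebook/dashboard in the run.
for view in resp.json().get("views", []):
    with open(f"{view['name']}.html", "w") as f:
        f.write(view["content"])
```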
- 1192 Views
- 0 replies
- 1 kudos
I am seeing a super weird behaviour in Databricks. We initially configured the following:
1. Account X in Account Console -> AWS Account arn:aws:iam::X:role/databricks-s3
2. We set up databricks-s3 as the S3 bucket in Account Console -> AWS Storage
3. W...
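With no replies on this one yet, a quick sanity check can at least confirm the role and bucket are wired together. A hedged sketch, assuming boto3 is available, the caller is allowed to assume the role, and reusing the placeholder ARN and bucket name from the post:

```python
# Sketch: assume the cross-account role, then list a few keys to verify
# the trust relationship and bucket policy actually line up.
import boto3

ROLE_ARN = "arn:aws:iam::X:role/databricks-s3"  # placeholder from the post
BUCKET = "databricks-s3"                        # placeholder from the post

creds = boto3.client("sts").assume_role(
    RoleArn=ROLE_ARN, RoleSessionName="databricks-s3-check")["Credentials"]
s3 = boto3.client("s3",
                  aws_access_key_id=creds["AccessKeyId"],
                  aws_secret_access_key=creds["SecretAccessKey"],
                  aws_session_token=creds["SessionToken"])

# A non-zero KeyCount means the role can read the bucket.
print(s3.list_objects_v2(Bucket=BUCKET, MaxKeys=5).get("KeyCount", 0))
```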
- 19364 Views
- 13 replies
- 1 kudos
Instead of setting the AWS access key and secret key in hadoopConfiguration, I would like to set them in the environment variables AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY.
How can I do that in Databricks?
Latest Reply
It is possible! I was able to confirm that the following post's "Best" answer works: https://forums.databricks.com/questions/11116/how-to-set-an-environment-variable.html
FYI for @Miklos Christine and @Mike Trewartha
12 More Replies
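The linked answer amounts to declaring the variables in the cluster configuration (Advanced Options > Spark > Environment Variables in the UI) rather than in hadoopConfiguration. A minimal sketch of the same thing via the Clusters API; the host, token, cluster spec, and secret scope names are all placeholder assumptions:

```python
# Sketch: set AWS credentials as cluster-level environment variables.
import requests

HOST = "https://<your-workspace>.cloud.databricks.com"  # assumption: workspace URL
TOKEN = "<personal-access-token>"                       # assumption: a valid PAT

requests.post(f"{HOST}/api/2.0/clusters/edit",
              headers={"Authorization": f"Bearer {TOKEN}"},
              json={
                  "cluster_id": "<cluster-id>",         # assumption: existing cluster
                  "cluster_name": "aws-creds-cluster",  # clusters/edit needs the full spec
                  "spark_version": "13.3.x-scala2.12",
                  "node_type_id": "i3.xlarge",
                  "num_workers": 2,
                  # The {{secrets/<scope>/<key>}} syntax keeps the keys out of
                  # plain text; the scope/key names here are assumptions.
                  "spark_env_vars": {
                      "AWS_ACCESS_KEY_ID": "{{secrets/aws/access_key_id}}",
                      "AWS_SECRET_ACCESS_KEY": "{{secrets/aws/secret_access_key}}",
                  },
              })
```

Processes on the cluster (including Spark) then see AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY as ordinary environment variables after the cluster restarts.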