Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

MoJaMa
by Valued Contributor II
  • 7227 Views
  • 8 replies
  • 2 kudos
Latest Reply
User15848365773
New Contributor II
  • 2 kudos

Hi @amitca71 @atanu. Yes, you can associate as many VPCs (the workspace deployment fundamental) across regions and AWS accounts with a single Databricks account; in fact, it's one of the superpowers of the Databricks platform, and you can even track all thei...

7 More Replies
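Regarding the reply above: a minimal sketch of registering a customer-managed VPC from another AWS account or region as a network configuration, assuming an E2 Databricks account, the Account API /networks endpoint, and basic-auth admin credentials; the account ID, VPC, subnet, and security-group IDs are placeholders.

import requests

# Placeholders: substitute your own account ID and admin credentials.
ACCOUNT_ID = "<databricks-account-id>"
ACCOUNT_HOST = "https://accounts.cloud.databricks.com"
AUTH = ("<account-admin-email>", "<password>")  # assumption: basic auth for the Account API

# Register a customer-managed VPC (it may live in any AWS account or region)
# as a network configuration that workspaces can later be deployed into.
payload = {
    "network_name": "prod-us-west-2-vpc",              # hypothetical name
    "vpc_id": "vpc-0123456789abcdef0",                 # VPC in the other AWS account
    "subnet_ids": ["subnet-aaaa1111", "subnet-bbbb2222"],
    "security_group_ids": ["sg-0123456789abcdef0"],
}

resp = requests.post(
    f"{ACCOUNT_HOST}/api/2.0/accounts/{ACCOUNT_ID}/networks",
    auth=AUTH,
    json=payload,
)
resp.raise_for_status()
print("network_id:", resp.json()["network_id"])

The returned network_id is what a workspace-creation request references, which is how several VPCs across accounts and regions end up attached to one Databricks account.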
spabba
by New Contributor II
  • 992 Views
  • 1 reply
  • 2 kudos

How to change databricks email alert notifications subject line?

Currently, our email notification subject for errors shows in the format below: <[AWS Account]> Error in run <RUN ID> of <Job Name>. In our current Databricks environment we have jobs for multiple environments such as Dev/QA/UAT/Prod, and it is very hard for us to...

Latest Reply
Anonymous
Not applicable
  • 2 kudos

Hi @Sanath Pabba, great to meet you, and thanks for your question! Let's see if your peers in the community have an answer. Thanks.

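On the subject-line question above: the alert subject template itself is not directly configurable, but since it embeds the job name, one common workaround is to encode the environment in the job name. A minimal sketch using the Jobs API 2.1 get/update endpoints; the host, token, job ID, and "[ENV]" naming convention are placeholders/assumptions.

import requests

HOST = "https://<your-workspace>.cloud.databricks.com"  # placeholder
TOKEN = "<personal-access-token>"                        # placeholder
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

def prefix_job_name(job_id: int, env: str) -> None:
    """Prefix the job name with its environment so the alert subject
    '<[AWS Account]> Error in run <RUN ID> of <Job Name>' reveals the env."""
    job = requests.get(f"{HOST}/api/2.1/jobs/get",
                       headers=HEADERS, params={"job_id": job_id}).json()
    name = job["settings"]["name"]
    if not name.startswith(f"[{env}] "):
        requests.post(
            f"{HOST}/api/2.1/jobs/update",
            headers=HEADERS,
            json={"job_id": job_id, "new_settings": {"name": f"[{env}] {name}"}},
        ).raise_for_status()

prefix_job_name(123, "PROD")  # hypothetical job_id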
sawya
by New Contributor II
  • 2047 Views
  • 3 replies
  • 0 kudos

Migrate workspaces to another AWS account

Hi everyone, I have a Databricks workspace in an AWS account that I have to migrate to a new AWS account. Do you know how I can do it? Or is it better to recreate a new one and move all the workbooks? And if I choose to create a new one, how can you export ...

Latest Reply
Abishek
Valued Contributor
  • 0 kudos

@AMADOU THIOUNE Can you check the link below to export the job runs? https://docs.databricks.com/jobs.html#export-job-runs. Try to reuse the same job_id with the /update and /reset endpoints; it should allow you much better access to previous run re...

2 More Replies
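Following the export-job-runs link in the reply above, a minimal sketch that lists a job's recent runs in the old workspace and saves each run's exported notebook views as HTML before migrating; the workspace host, token, and job ID are placeholders.

import requests

HOST = "https://<old-workspace>.cloud.databricks.com"  # placeholder
TOKEN = "<personal-access-token>"                       # placeholder
HEADERS = {"Authorization": f"Bearer {TOKEN}"}
JOB_ID = 123                                            # hypothetical job_id

# List recent runs of the job in the old workspace...
runs = requests.get(f"{HOST}/api/2.1/jobs/runs/list",
                    headers=HEADERS,
                    params={"job_id": JOB_ID, "limit": 25}).json().get("runs", [])

# ...and export each run's notebook views as HTML files for archiving.
for run in runs:
    export = requests.get(f"{HOST}/api/2.1/jobs/runs/export",
                          headers=HEADERS,
                          params={"run_id": run["run_id"],
                                  "views_to_export": "ALL"}).json()
    for view in export.get("views", []):
        with open(f"run_{run['run_id']}_{view['name']}.html", "w") as f:
            f.write(view["content"])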
MohitAnchlia
by New Contributor II
  • 1008 Views
  • 1 reply
  • 1 kudos

Change AWS storage setting and account

I am seeing super weird behaviour in Databricks. We initially configured the following:
1. Account X in Account Console -> AWS Account: arn:aws:iam::X:role/databricks-s3
2. We set up databricks-s3 as the S3 bucket in Account Console -> AWS Storage
3. W...

Latest Reply
Kaniz_Fatma
Community Manager
  • 1 kudos

Hi @MohitAnchlia! My name is Kaniz, and I'm the technical moderator here. Great to meet you, and thanks for your question! Let's see if your peers on the forum have an answer to your question first. Otherwise, I will follow up shortly with a response.

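For the account-console question above, a minimal sketch that lists the credential (cross-account IAM role) and storage (root S3 bucket) configurations currently registered for the account, which can help confirm which role ARN and bucket are actually in effect. It assumes an E2 account, the Account API credentials and storage-configurations endpoints, and basic-auth admin credentials; the account ID is a placeholder.

import requests

ACCOUNT_ID = "<databricks-account-id>"                   # placeholder
ACCOUNT_HOST = "https://accounts.cloud.databricks.com"
AUTH = ("<account-admin-email>", "<password>")           # assumption: basic auth

# Credential configurations: which cross-account IAM role ARNs are registered.
creds = requests.get(
    f"{ACCOUNT_HOST}/api/2.0/accounts/{ACCOUNT_ID}/credentials", auth=AUTH).json()
for cred in creds:
    print(cred["credentials_name"],
          cred["aws_credentials"]["sts_role"]["role_arn"])

# Storage configurations: which root S3 buckets are registered.
storages = requests.get(
    f"{ACCOUNT_HOST}/api/2.0/accounts/{ACCOUNT_ID}/storage-configurations",
    auth=AUTH).json()
for storage in storages:
    print(storage["storage_configuration_name"],
          storage["root_bucket_info"]["bucket_name"])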
samalexg
by New Contributor III
  • 16237 Views
  • 13 replies
  • 1 kudos

How to add environment variable

Instead of setting the AWS access key and secret key in hadoopConfiguration, I would like to add them as the environment variables AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY. How can I do that in Databricks?

Latest Reply
jric
New Contributor II
  • 1 kudos

It is possible! I was able to confirm that the "Best" answer in the following post works: https://forums.databricks.com/questions/11116/how-to-set-an-environment-variable.html (FYI for @Miklos Christine and @Mike Trewartha).

12 More Replies
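Building on the confirmation above, a minimal sketch that sets the two variables as cluster environment variables through the Clusters API (the same setting exposed under Advanced Options > Spark > Environment Variables in the UI); the host, token, and cluster ID are placeholders, and in practice the values would normally come from a secret rather than being hard-coded.

import requests

HOST = "https://<your-workspace>.cloud.databricks.com"  # placeholder
TOKEN = "<personal-access-token>"                        # placeholder
HEADERS = {"Authorization": f"Bearer {TOKEN}"}
CLUSTER_ID = "<cluster-id>"                              # placeholder

# Fetch the cluster's current settings, add the environment variables,
# and push the edit back (the cluster restarts to pick them up).
cluster = requests.get(f"{HOST}/api/2.0/clusters/get",
                       headers=HEADERS,
                       params={"cluster_id": CLUSTER_ID}).json()

env = cluster.get("spark_env_vars", {})
env["AWS_ACCESS_KEY_ID"] = "<access-key-id>"          # ideally pulled from a secret
env["AWS_SECRET_ACCESS_KEY"] = "<secret-access-key>"  # ideally pulled from a secret

edit = {
    "cluster_id": cluster["cluster_id"],
    "cluster_name": cluster["cluster_name"],
    "spark_version": cluster["spark_version"],
    "node_type_id": cluster["node_type_id"],
    "num_workers": cluster.get("num_workers", 1),
    "spark_env_vars": env,
}
requests.post(f"{HOST}/api/2.0/clusters/edit",
              headers=HEADERS, json=edit).raise_for_status()

# Once the cluster is back up, notebook code sees them via:
#   import os; os.environ["AWS_ACCESS_KEY_ID"]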